
    ST-Trader: A Spatial-Temporal Deep Neural Network for Modeling Stock Market Movement

IEEE/CAA Journal of Automatica Sinica, 2021, Issue 5

    Xiurui Hou, Kai Wang, Cheng Zhong, and Zhi Wei, Senior Member, IEEE

Abstract—Stocks that are fundamentally connected with each other tend to move together. Considering such common trends is believed to benefit stock movement forecasting tasks. However, such signals are not trivial to model because the connections among stocks are not physically presented and need to be estimated from volatile data. Motivated by this observation, we propose a framework that incorporates the inter-connection of firms to forecast stock prices. To effectively utilize a large set of fundamental features, we further design a novel pipeline. First, we use a variational autoencoder (VAE) to reduce the dimension of stock fundamental information and then cluster stocks into a graph structure (fundamental clustering). Second, a hybrid model of graph convolutional network and long-short term memory network (GCN-LSTM) with an adjacency graph matrix (learnt from the VAE) is proposed for graph-structured stock market forecasting. Experiments on minute-level U.S. stock market data demonstrate that our model effectively captures both spatial and temporal signals and achieves superior improvement over baseline methods. The proposed model is promising for other applications in which there is a possible but hidden spatial dependency to improve time-series prediction.

    I. INTRODUCTION

THERE is strong evidence that stock prices of firms that interact with each other move together, for several reasons. First, exchange-traded funds (ETFs), such as S&P 500 and NASDAQ funds, track the prices of a basket of stocks. When people trade those funds, all the underlying stocks are traded simultaneously, which causes common fluctuations of those stock prices [1]. Second, most professional portfolio managers are specialized in a couple of strategies, and these strategies often involve a similar set of stocks. For example, value investing [2] tilts toward firms with a high earnings-to-price ratio, while momentum strategies focus on firms with higher returns during the past year. On the one hand, any fundamental shock can affect the prices of a group of stocks together. The ongoing COVID-19 pandemic struck the traveling industry harder than the technology sector, so we observed stocks within the traveling industry fall significantly and simultaneously while those of tech companies did not drop much. On the other hand, portfolio managers may adjust their positions for idiosyncratic reasons, for example, when the price target predicted by their own model changes. Third, some companies have cooperative relationships, like Apple and Nvidia. If one of them has good or bad news, the effect on the other could be reflected in its stock price. To the best of our knowledge, however, we have not seen an algorithm effectively incorporating these hidden dependencies among firms into the stock price forecasting task.

Although the interaction among companies is not difficult to observe, it is not easy to incorporate it into stock price forecasting tasks, for three reasons: 1) there are too many fundamental variables to select from (usually more than 1000 in total), and extra financial expertise is needed to filter out the key variables; 2) even when the key variables are selected, the interaction implied by those variables is not trivial to model due to nonlinearity and chaos; 3) the way that the interaction contributes to the final forecasting goal is the biggest obstacle to utilizing the fundamental information, because such static information has different frequencies and scales from the time-varying price variable.

Deep learning is well deployed on grid- and sequence-structured data, like image recognition in autonomous driving [3] and natural language processing [4]. However, graph-structured data is more common but more complex in the real world, like social relationships, sensor networks in smart cities, and stock connections in the financial market. Recently, there has been a surge of attention on graph representation learning, for example, link prediction, graph classification, and node classification. Motivated by the graph representation employed by previous studies, we propose a framework that models the hidden dependency among firms as a graph.

In the field of graph representation learning, tasks with readily available graph information are well studied. For example, [5] used graph convolutional networks (GCNs) to improve recommendation systems; [6] used GCNs to learn material properties directly from the connection of atoms to predict density. However, those frameworks cannot be directly applied to stock price forecasting since it is mainly a time-series problem. Even though the stocks can be seen as nodes in a graph, edges among nodes are not trivial to define because there is no spatial locality attribute for each stock. Besides, forecasting stock prices is a challenging task due to its tumultuous nature, which prevents transfer learning from time-series tasks with apparent seasonality or trend. Due to such irregularities, the direct application of existing graph-related models to the stock market is not appropriate.

In this work, we propose a hybrid deep learning pipeline, VAE-GCN-LSTM, to incorporate the graph-structured relationship among firms into time-series forecasting tasks. The main contributions of this paper can be summarized as follows:

    1) We design a variational autoencoder (VAE) to learn the lower-dimension latent features of companies’ fundamental variables to calculate more meaningful distance among companies, which helps construct the graph network.

2) We develop a hybrid deep neural network of graph convolutional network and long-short term memory network (GCN-LSTM) to model both the graph-structured interaction among stocks and the stock price fluctuations on the timeline.

3) To evaluate the proposed method and the improvement contributed by the additional features, we conduct comprehensive experiments on a real dataset, predicting both the numerical stock price and the binary stock price movement.

We consider the 87 largest firms listed in the United States in our experiment. These firms are included in the S&P 100 index, and we only incorporate 87 out of the 100 firms due to data availability. Fig. 1 shows the underlying connections among these firms. In Section III we introduce how we take the connections among firms into account in the price forecasting task, and the actual performance is presented in Section IV.

    II. RELATED WORK

    A. Classic Approaches in Stock Market

Many factors and firm characteristics have been demonstrated to be effective in forecasting stock prices [7]. For example, financial practitioners and fund managers have been following value investing strategies after, or even prior to, the publication of "The Intelligent Investor" [8] in 1965. Since then, exploring effective factors has become a hot topic and a slew of factors have been explored. Examples of these factors include the book-to-market value of a firm [9], the price-to-earnings ratio [10], relative strength trading strategies that focus on the recent stock return of a firm [11], and the profitability and investment of a firm [12]. These factors possess valuable predictive power in terms of forecasting a firm's future return.

The advent of advanced machine learning methods makes the stock return forecasting problem a more competitive task. Time-series and cross-section regressions are no longer the only toolkit we are able to utilize. For instance, [13] applied a support vector machine (SVM) to forecasting the NIKKEI 225 index's movement on a weekly basis. In another independent work, [14] predicted the stock price movement of Taiwan (China) companies. Reference [15] used Random Forest to forecast future stock prices and found that it outperforms both artificial neural networks (ANNs) and SVM.

    B. Deep Learning Approaches in Stock Market

Deep learning has achieved exciting performance in many areas, e.g., image recognition tasks [16] and natural language processing tasks [17]. The applications of deep learning also include stock price forecasting. Both the absolute price or return (numerical prediction) and the price movement (binary classification) are popular forecasting goals for researchers. Fully connected neural networks have been applied to predict future stock prices in the Chinese market [18] and Canadian markets [19], and to predict the stock return of the Japan Index [20] and the S&P 500 [21]. Long-short term memory (LSTM), which is designed for time-series tasks, has been used in this forecasting problem [22], [23] since the stock price can be seen as a time-series sequence. In comparison results, LSTM has been demonstrated to outperform fully connected neural networks.

Hybrid models, which take various sources of information to enrich the predictive power of conventional machine learning algorithms, have made the forecasting task more promising in recent years. For example, [24], [25] proposed a combination of wavelet techniques and neural networks. Reference [26] introduced a hybrid model that combines autoregressive moving average models and artificial neural networks. A more recent work by [27] ensembled machine learning methods and financial technical analysis. In some related fields, like crowdfunding [28] and user intended action prediction [29], the hybridization of different types of neural networks has been applied successfully, potentially due to its ability to accommodate heterogeneous input features. Reference [30] proposed a hybrid model combining long-short term memory and a deep neural network (LSTM-DNN) for a stock forecasting task. This hybrid model, LSTM-DNN, first integrates static firm fundamental features into a time-series forecasting task. Refer to [31]–[34] for more hybrid models on forecasting problems. However, no matter how the neural network structure changes in the literature above, none of them takes the interactions among firms into consideration in the prediction task. It is intriguing to extract these implicit interactions and use them as input for the forecasting tasks.

    C. Convolutional Neural Networks in Graph-Structure Data

Recently, applying convolutional neural networks (CNNs) to graphs with arbitrary structures has caught people's attention. Two main directions are being explored in the literature: 1) the definition of spatial convolution is generalized in [35], [36]; 2) generalizing CNNs to 3-dimensional data as a multiplication in the graph Fourier domain is discussed in [35], [37], [38] by way of the convolution theorem. Using geodesic polar coordinates, the authors define the convolution operation on meshes. Therefore, this method is suitable for manifolds and cannot be directly applied to graphs with arbitrary structures. The spatial approach proposed in [36] has more potential for generalizing CNNs to arbitrary graphs. It has three steps: first, select a target node; second, construct the neighborhood of the target node; third, normalize the selected sub-graph by ordering the neighbors. The normalized sub-graphs are then fed into a 1-dimensional Euclidean CNN. Since there is no natural ordering property in graphs, either temporal or spatial, it has to be imposed by a labeling procedure. The spectral framework to solve this issue was first introduced by [37] and is described in Section III-D. The main disadvantage of this method is its high computational complexity, $O(n^2)$. To overcome this problem, [38] provides strictly localized filters with a linear complexity $O(|E|)$. The first-order approximation of the proposed spectral filter is adopted by [39] in a semi-supervised node classification task. Thus, we also use the spectral filters introduced in [38] in our framework because of their efficiency, and we denote the convolution operation as $*_\mathcal{G}$.
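To make the filtering operation concrete, the following Python/NumPy sketch applies a K-order Chebyshev spectral filter of the kind introduced in [38] to node signals; the function name, the dense-matrix implementation, and the coefficient layout are illustrative assumptions, not the authors' code.

import numpy as np

def chebyshev_graph_conv(x, L_tilde, theta):
    """Apply a Chebyshev spectral filter  sum_k theta_k T_k(L~) x.

    x       : (N, F) node signals for N nodes with F features
    L_tilde : (N, N) rescaled Laplacian, 2 L / lambda_max - I
    theta   : (K, F, F_out) filter coefficients, one matrix per polynomial order
    """
    K = theta.shape[0]
    Tx = [x]                       # T_0(L~) x = x
    if K > 1:
        Tx.append(L_tilde @ x)     # T_1(L~) x = L~ x
    for _ in range(2, K):
        Tx.append(2 * L_tilde @ Tx[-1] - Tx[-2])   # Chebyshev recurrence
    return sum(Tx[k] @ theta[k] for k in range(K))

Because each T_k(L~) only mixes signals within k hops, the filter is strictly localized, which is the property exploited later when tuning the neighborhood size K.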

Applying the graph convolution operation in time-series tasks has been demonstrated to be helpful in several studies. Reference [40] developed a spatio-temporal GCN model for traffic forecasting. The conventional method for traffic forecasting is to perform time-series analysis for each traffic entity, like a specific highway or a city road. The more natural thought is that if two roads are close, they have a high probability of experiencing the same volume of traffic. In their study, the GCN model can perform the convolution operation with much faster training speed and fewer parameters than the traditional CNN model, which is more suitable for grid-structured data, e.g., images. Reference [41] also proposed a spatio-temporal GCN model for human-body-skeleton-based action recognition, and the improvement is significant. Reference [42] used GCNs in dynamic texture recognition by extracting low-level features. Reference [43] combined LSTM and convolutional LSTM for capturing both time-sequencing features and map-sequencing features. There are some attempts to combine time-series forecasting and graph-structured convolution operations, like [44] and [45], which forecast wind speed and solar radiation by enriching the time series with wind farm distances and solar site distances. The wind farms located nearby are supposed to experience much the same wind speed and direction. Wind forecasting tasks benefit from the geographical information via the graph convolutional network.

    III. METHOD

In this section, we start with the problem formulation of the prediction task with spatial and temporal dependencies. Then, we use a VAE to reduce the dimension of the fundamental feature space and generate the spatial graph structure for the selected stocks. After that, we introduce a spatial-temporal model, GCN-LSTM, to predict future stock prices.

    A. Problem Formulation

    B. Variational Autoencoder

Fig. 2. Spatial-temporal modeling using the GCN-LSTM framework for an unspecified spatial graph structure. The area shaded in green is the VAE that reduces the dimension of the fundamental features to learn a more meaningful distance among stocks. The network below it is the graph constructed from the learnt distances. The vertical panel to the right of the VAE presents the convolution neighbors of each node. The area shaded in yellow is the network of an LSTM cell. The time-series inputs, enriched with fundamental signals by the convolution operation, are fed into an LSTM network for the final predictions.
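As a concrete illustration of the VAE component shaded in green in Fig. 2, a compact encoder-decoder could be written in PyTorch as below; the class name, the single-hidden-layer architecture, the latent dimension, and the Gaussian reconstruction loss are our own assumptions for the sketch, not the authors' implementation.

import torch
import torch.nn as nn

class FundamentalVAE(nn.Module):
    # Sketch: encode firm fundamentals into a low-dimensional latent space.
    def __init__(self, n_features, hidden_dim=16, latent_dim=3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, hidden_dim), nn.ReLU())
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, n_features))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # squared reconstruction error plus KL divergence to a standard normal prior
    recon = ((x - x_hat) ** 2).sum()
    kl = -0.5 * torch.sum(1.0 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

The latent vectors (the means mu) are then used to measure distances between firms when constructing the graph.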

    C. Long-Short Term Memory (LSTM)

In Fig. 2, the construction of the LSTM is represented in the area shaded in yellow, located in the upper-right corner. A compact form of the equations for the forward pass of an LSTM unit with a forget gate is given below:
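In the standard formulation (the weight and bias naming here is the conventional one and may differ from the paper's), with $\sigma$ the logistic sigmoid and $\odot$ the element-wise product:

$$
\begin{aligned}
f_t &= \sigma\left(W_f x_t + U_f h_{t-1} + b_f\right) &&\text{(forget gate)}\\
i_t &= \sigma\left(W_i x_t + U_i h_{t-1} + b_i\right) &&\text{(input gate)}\\
o_t &= \sigma\left(W_o x_t + U_o h_{t-1} + b_o\right) &&\text{(output gate)}\\
\tilde{c}_t &= \tanh\left(W_c x_t + U_c h_{t-1} + b_c\right) &&\text{(candidate cell state)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t\\
h_t &= o_t \odot \tanh\left(c_t\right)
\end{aligned}
$$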

    D. Graph Convolutional Networks

    E. ST-Trader: Spatial-Temporal Modeling Using GCN-LSTM Framework

We propose to incorporate the spatial signal into time-series prediction with a graph convolutional network. The global spatial dependency learnt via the VAE is represented by the adjacency matrix, $A$, calculated from the features in the low-dimensional latent space. For notational simplicity, let $*_\mathcal{G}\, x_t$ denote a convolution operation on $x_t$ with filters (kernel size: $d_h \times T$) that are functions of the graph Laplacian $L$, as noted in (14). By replacing the original inputs with convolution-applied inputs, the equations of the GCN-LSTM cell are given as follows:
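Following the text, a plausible form is obtained by substituting the convolved inputs into the standard gates above (the exact parameterization in the original, e.g., whether the hidden state is also convolved, may differ):

$$
\begin{aligned}
f_t &= \sigma\big(W_f\,(*_\mathcal{G}\, x_t) + U_f h_{t-1} + b_f\big)\\
i_t &= \sigma\big(W_i\,(*_\mathcal{G}\, x_t) + U_i h_{t-1} + b_i\big)\\
o_t &= \sigma\big(W_o\,(*_\mathcal{G}\, x_t) + U_o h_{t-1} + b_o\big)\\
\tilde{c}_t &= \tanh\big(W_c\,(*_\mathcal{G}\, x_t) + U_c h_{t-1} + b_c\big)\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, \qquad h_t = o_t \odot \tanh(c_t)
\end{aligned}
$$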

In our setting, $W_{\cdot,h} \in \mathbb{R}^{K \times d_h \times d_h}$ are the Chebyshev coefficients that define the support of the graph convolutional kernels. $W_{\cdot,h}$ determines the number of parameters, which is independent of the number of nodes in the graph. The parameter $K$ determines the number of neighbors used to compute the aggregated states for any target node, so it also determines the communication overhead in a distributed computing setting. The detailed algorithm description is shown in Algorithm 1.
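For intuition, a minimal PyTorch sketch of such a cell is given below; stacking the four gate pre-activations into a single Chebyshev-parameterized tensor, the class and argument names, and the initialization scale are our own illustrative choices rather than the authors' implementation.

import torch
import torch.nn as nn

class GCNLSTMCell(nn.Module):
    # Sketch of one GCN-LSTM cell: inputs are graph-convolved with Chebyshev
    # filters before entering the standard LSTM gates.
    def __init__(self, in_dim, hidden_dim, K):
        super().__init__()
        self.K = K
        # One (in_dim -> 4*hidden_dim) coefficient matrix per Chebyshev order;
        # the forget/input/output/candidate pre-activations are stacked together.
        self.theta = nn.Parameter(0.01 * torch.randn(K, in_dim, 4 * hidden_dim))
        self.U = nn.Linear(hidden_dim, 4 * hidden_dim)

    def cheb_conv(self, x, L_tilde):
        # x: (N, in_dim) node features; L_tilde: (N, N) rescaled Laplacian.
        Tx = [x]
        if self.K > 1:
            Tx.append(L_tilde @ x)
        for _ in range(2, self.K):
            Tx.append(2 * L_tilde @ Tx[-1] - Tx[-2])   # Chebyshev recurrence
        return sum(Tx[k] @ self.theta[k] for k in range(self.K))

    def forward(self, x, h, c, L_tilde):
        gates = self.cheb_conv(x, L_tilde) + self.U(h)
        f, i, o, g = gates.chunk(4, dim=-1)
        c_new = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h_new = torch.sigmoid(o) * torch.tanh(c_new)
        return h_new, c_new

Because the coefficient tensor has shape (K, input dim, output dim), the parameter count does not grow with the number of stocks in the graph, as noted above.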

Algorithm 1 Training Process for the ST-Trader
Input: fundamental feature matrix $F \in \mathbb{R}^{N \times M}$ with $M$ variables for $N$ stocks; time-series feature matrix $X \in \mathbb{R}^{N \times T}$ with look-back window $T$ for $N$ stocks; $\gamma^2 = 0.1$; $? = 0.5$; minibatch size 256; polynomial order $K = 3$; number of epochs $E = 1000$; hidden space dimension $h = 16$; network depth $d = 3$.
Parameters: $\phi, \psi$ of the VAE; $\theta$ of the GCN-LSTM.
$\phi, \psi \leftarrow$ Initialize parameters;
repeat
    $g \leftarrow \nabla_{\phi,\psi}\,\mathcal{L}(\phi, \psi; F)$ (gradients of the minibatch estimator of (3));
    $\phi, \psi \leftarrow$ update parameters using gradients $g$ with learning rate $1\mathrm{e}{-5}$ and the Adam optimizer [48];
until convergence of parameters $(\phi, \psi)$;
$A \leftarrow$ the adjacency matrix encoding the edge information in $\mathcal{G}$, derived from the distances between the latent features produced by the VAE with parameters $\phi, \psi$;
$\tilde{L} \leftarrow$ calculate the rescaled Laplacian using $A$;
$\theta \leftarrow$ Initialize parameters;
$e \leftarrow 1$;
while $e \le E$ do
    $X' \leftarrow$ apply the convolution operator $*_\mathcal{G}$ on $X$ using (14);
    $\hat{y} \leftarrow \mathrm{LSTM}(X')$;
    update $W_{x\cdot}$, $W_{h\cdot}$, and $b$ in $\theta$ by gradient descent using (1) with learning rate $1\mathrm{e}{-5}$ and the RMSprop optimizer [49];
    $e \leftarrow e + 1$;
end while
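For the adjacency and Laplacian steps, a minimal Python/NumPy sketch is shown below; it assumes edges connect each stock to its nearest neighbors in the VAE latent space, which is an illustrative rule (the paper only states that the adjacency is derived from distances between latent features).

import numpy as np

def rescaled_laplacian_from_latent(Z, k_nearest=5):
    """Build a k-NN adjacency from latent features Z (N, d) and return the
    rescaled Laplacian  L~ = 2 L / lambda_max - I  used by the Chebyshev filters."""
    N = Z.shape[0]
    # pairwise Euclidean distances in the latent space
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    A = np.zeros((N, N))
    for i in range(N):
        nbrs = np.argsort(D[i])[1:k_nearest + 1]   # skip the node itself
        A[i, nbrs] = 1.0
    A = np.maximum(A, A.T)                          # symmetrize
    L = np.diag(A.sum(axis=1)) - A                  # combinatorial Laplacian
    lam_max = max(np.linalg.eigvalsh(L).max(), 1e-12)
    return 2.0 * L / lam_max - np.eye(N)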

IV. EXPERIMENT

    A. Data Description

Since we consider the spatial dependency among firms, the number of our training samples is divided by the number of firms. For example, suppose we have 10 firms and 100 observations for each firm. The total number of samples is 1000 if we treat firms independently, but the number drops to 100 if we consider the relations among firms. To obtain enough samples, we use minute-level stock data of 87 firms from the S&P 100 composite in 2010, due to the availability of data. The total number of minute-level observations is 97 890, and we split the whole dataset into batches using a sliding window. We also check the robustness of our results using five-minute-interval stock prices, which guarantees enough samples for at least one epoch of training and testing. For the five-minute interval, the sample sizes of the training, validation, and testing sets are 16 384, 2944, and 2944, respectively. The fundamental variables and stock tickers used are presented in the Appendix. For categorical variables such as SEC and SIC, we use their one-hot encodings as input to the VAE to learn the latent features.
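As an illustration of the batching step, a simple sliding-window split that keeps all stocks in every sample (so the graph structure is preserved) might look like the sketch below; the function and argument names are our own, not the authors' code.

import numpy as np

def sliding_windows(prices, lookback, horizon=1):
    """Split a (N_stocks, T_total) price matrix into (input, target) pairs.

    Each input sample has shape (N_stocks, lookback) and the target is the
    price of every stock `horizon` steps after the window ends."""
    X, y = [], []
    for t in range(prices.shape[1] - lookback - horizon + 1):
        X.append(prices[:, t:t + lookback])
        y.append(prices[:, t + lookback + horizon - 1])
    return np.stack(X), np.stack(y)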

    B. Experimental Settings

For one-minute-level data, the testing period is the month after the training period. For example, if the testing period is Feb 2010, the training data would be Jan 2010. The last testing period is Dec 2010. Thus, we have 11 testing periods, and the number of samples for each month is listed in Table I.

Algorithms that work on daily data may cover a period of many years, since one year has only around 250 trading dates (data points). However, as shown in Table I, one month can have more than 7000 minute-level samples/data points. We note that the months in 2010 cover different market scenarios, such as uptrend (e.g., March), downtrend (e.g., August), and mixed (e.g., June) ones. Therefore, the minute-level data of year 2010 are representative of different market scenarios and sufficient for model validation.

To demonstrate the benefit of incorporating the spatial dependency among stocks in price forecasting, we consider the following baseline methods: 1) LR: for the classical linear regression model, we treat the historical time-series prices as explanatory variables and the price of the next time point as the response variable; 2) FCNN: a fully-connected neural network, which captures the non-linear relationship among time-series features; 3) LSTM: a long-short term memory neural network, which contributes partially to the proposed method; 4) ecldn_ST-Trader: "ecldn" means calculating the Euclidean distance between a pair of stocks using the original fundamental variables; the spatial relationship in this setting is expected to be much noisier than using the VAE; 5) idsty_ST-Trader: "idsty" means "industry"; the adjacency matrix for this method is derived from the industry category, where $A_{ij} = 1$ means company $i$ and company $j$ are in the same industry category. The proposed model is denoted as vae_ST-Trader when compared to those baseline methods because it differs in deriving the adjacency matrix via the VAE. The purpose of studying baseline methods 4) and 5) is to evaluate the ability of the VAE to extract latent features from the high-dimensional feature space. The network structure and the hyperparameters for all methods are tuned using the validation set, and the final performance results are derived on the testing set.
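For instance, the industry-based adjacency used by idsty_ST-Trader can be built directly from the rule $A_{ij} = 1$ for firms in the same industry category, as in this sketch (the function name and the representation of industry codes are assumptions):

import numpy as np

def industry_adjacency(industry_codes):
    """A_ij = 1 when firms i and j share an industry category (idsty_ST-Trader)."""
    codes = np.asarray(industry_codes)
    A = (codes[:, None] == codes[None, :]).astype(float)
    np.fill_diagonal(A, 0.0)   # no self-loops
    return A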

    TABLE I NUMBER OF SAMPLES FOR EACH MONTH

We apply all methods mentioned above to forecast two targets: the numerical stock price and the binary price movement indicator (the label is 1 if the price goes up from the last time point and 0 if it goes down). Since deep neural networks are not stable when predicting unbounded numerical results, we scale both the training set and the testing set using min-max normalization (see (20)), with the maximum and minimum values taken from the training set.
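Because (20) is not reproduced here, the scaling is sketched below in its standard form; using only the training-set extremes for both splits avoids look-ahead bias.

def min_max_scale(train, test):
    """Scale NumPy arrays with the training-set extremes only: x' = (x - min) / (max - min)."""
    lo, hi = train.min(), train.max()
    return (train - lo) / (hi - lo), (test - lo) / (hi - lo)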

    C. Evaluation Metrics

For the numerical stock price prediction, we adopt widely used metrics for real-valued prediction problems [50]–[53]. They are defined as follows:

1) Mean absolute percentage error (MAPE):
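In their standard form (the paper's own equation numbers are not reproduced here), with $y_t$ the observed price and $\hat{y}_t$ the prediction over $n$ test points:

$$
\mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left|\frac{y_t-\hat{y}_t}{y_t}\right|,
\qquad
\mathrm{MdAPE} = 100\% \times \operatorname{median}_{t}\left|\frac{y_t-\hat{y}_t}{y_t}\right|.
$$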

MdAPE is computed from the median rather than the mean of the absolute percentage errors and is thus more robust to outliers than MAPE.

For the binary price movement prediction, we use the area under the ROC curve (AUC) and the following metrics (TP for true positive, TN for true negative, FP for false positive, and FN for false negative):
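The standard definitions of the counting-based metrics are given below; the exact subset reported in the tables may differ, but accuracy (ACC) appears explicitly in Fig. 3.

$$
\mathrm{Accuracy}=\frac{TP+TN}{TP+TN+FP+FN},\quad
\mathrm{Precision}=\frac{TP}{TP+FP},\quad
\mathrm{Recall}=\frac{TP}{TP+FN},\quad
F_1=\frac{2\,\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision}+\mathrm{Recall}}.
$$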

    D. Results

1) Predicting Stock Price: Table II presents the MAPE and MdAPE for all testing periods. That MdAPE is usually smaller than MAPE indicates that the stock prices contain extensive outliers. The methods enriched with spatial information achieve better prediction results than temporal-only models. The proposed model, vae_ST-Trader, outperforms the baselines across the board. Moreover, LSTM is more desirable than LR and FCNN in most batches. Interestingly, idsty_ST-Trader, with only industry information, is preferable to ecldn_ST-Trader, which incorporates much more fundamental information. One possibility is that the simple Euclidean distance calculated from all fundamental variables brings more noise into the spatial signal because it assigns equal weight to each variable. The contribution of each variable to the final prediction is hard to quantify with a linear model. Clearly, extracting the latent interaction (spatial distance on the latent features) among firms using the VAE benefits the prediction accuracy substantially.

2) Predicting Stock Price Movement: Fig. 3 presents the accuracy and AUC scores of the different methods on binary movement prediction in chronological order. The efficient-market hypothesis (https://en.wikipedia.org/wiki/Efficient-market_hypothesis) states that asset prices reflect all available information, so there is not much space for algorithms to forecast stock prices. This hypothesis is supported by the poor accuracy in our study (Fig. 3) and in other literature. Although many investors apply value-investing strategies, which tie the market price of a stock to its underlying fundamental value, many other investors keep adopting technical analysis; they make trading decisions based on reading charts of the historical price trends. Therefore, there is still room for methods to improve their predictive power if they do not take the influence of technical-analysis traders on the stock price into account. The proposed model is enriched by the extracted relationships among the firms so that it can better capture the trend signal compared to baseline methods. This advantage is reflected in both the accuracy and AUC scores.

The flash crash on May 6, 2010 is an example of an extreme short-term price movement in the market. Around 2:30 p.m. EST on May 6, 2010, the Dow Jones Industrial Average fell more than 1000 points in just 10 minutes (https://en.wikipedia.org/wiki/2010_flash_crash). It was the biggest drop in history at that point. Despite trillions of dollars in equity being wiped out, the market recovered to its pre-crisis level by the end of the day. Analyzing the causes of such events is beyond the scope of this paper. However, they have a profound effect on the stability of our algorithm since these extreme events can hardly be thought of as shocks from fundamental information. Incorporating fundamental connections in the prediction task may lower the prediction accuracy under this particular scenario. By avoiding such events, we may expect vae_ST-Trader to perform better in predicting stock prices over longer time intervals, e.g., day-to-day or week-to-week. However, we are not able to conduct experiments on that due to the lack of a sufficient number of observations.

    TABLE II MAPE (MDAPE) FOR ONE-MINUTE-INTERVAL PREDICTION

    Fig. 3. ACC and AUC comparison for different methods across different testing months.

3) The Number of Neighbors to Communicate With: We compare the performance of the proposed model at different granularities and with different values of the parameter $K$ in Table III. We denote one-minute-interval and five-minute-interval price forecasting as $\Delta t = 1$ and $\Delta t = 5$. The superscripts 1 and 5 highlight the best scores for $\Delta t = 1$ and $\Delta t = 5$, respectively. The results for $\Delta t = 1$ are aggregated across all testing months. There are two results worth mentioning. First, the outcomes for $\Delta t = 5$ are better than those for $\Delta t = 1$ on all evaluation metrics. This result is as expected because the price movement over a five-minute interval is much less noisy than over a one-minute interval, for a couple of reasons: i) rare events like the flash crash can recover so quickly that five-minute-level data can almost screen out such events; ii) the recorded stock price bounces back and forth between the bid and ask quotes, and the price fluctuation at the five-minute level is less likely to be affected by the bid-ask bounce. If we had enough price observations at a longer interval, the predictive power of the proposed model could be expected to improve further.

    TABLE III BINARY PRICE MOVEMENT PREDICTION WITH DIFFERENT PARAMETER K FOR ONE-MINUTE-INTERVAL AND FIVE-MINUTE-INTERVAL

Second, for the communication overhead parameter $K$, which is the number of nodes any target node $i$ should exchange signals with in order to derive its local states, we use different values of $K$ to explore how the communication affects performance at different granularities. The one-minute interval achieves its best performance when $K=3$ and the five-minute interval when $K=5$, which means a finer granularity prefers lighter communication with its neighbors. We give one possible explanation. For the one-minute interval, where the fluctuation of the price is more random and hence the dependency is less reliable, communication with more neighbors brings more noise into the forecasting. As the time interval increases, the price trend becomes more stable and the common fluctuation more pronounced, so the infusion of neighborhood signals becomes more informative. Thus, the number of neighbors supporting the center node is a key hyperparameter to tune for a specific time-series forecasting task.

    V. CONCLUSION

In this paper, we propose a spatial-temporal neural network framework, GCN-LSTM, to utilize the spatial dependency, or latent interaction, among firms in forecasting stock price movement. The stock market has rarely been treated as a graph since there is no inborn geographical location for stock entities. However, there is strong evidence that the interactions among firms affect stock price movements. Experimental results show that our model outperforms other state-of-the-art methods on real-world minute-level stock price data. Fundamental features represented in a spatial structure contribute to the improvement in forecasting accuracy. For future directions, we plan to investigate how the combination of fundamental variables and fiscal reports, which can be seen as a dynamic cross-section assessment of a company, contributes to predicting the stock market trend. More practical time-series applications with potential spatial dependency should be explored under the proposed modeling framework. Advanced approaches [54], [55] to the fine-tuning of the hyper-parameters of the proposed framework should also be explored.

ACKNOWLEDGMENT

    The authors would like to thank Sophia Chen for proofreading.

    APPENDIX

    TABLE IV FIRM FUNDAMENTAL VARIABLES

    TABLE V STOCK TICKER LIST

