

    A Multi-model Approach for Soft Sensor Development Based on Feature Extraction Using Weighted Kernel Fisher Criterion*

    Lü Ye (呂業) and YANG Huizhong (楊慧中)**

    Key Laboratory of Advanced Process Control for Light Industry of Jiangnan University, Wuxi 214122, China

    A multi-model approach can significantly improve the prediction performance of soft sensors for processes with multiple operating conditions. However, traditional clustering algorithms may produce overlapping subclasses, so edge classes and outliers cannot be handled effectively and the modeling results are unsatisfactory. To solve these problems, a new feature extraction method based on a weighted kernel Fisher criterion is presented to improve clustering accuracy, in which feature mapping brings the edge classes and outliers closer to the other, normal subclasses. The classified data are then used to develop a multiple model based on support vector machines. The proposed method is applied to a bisphenol A production process to predict a quality index. The simulation results demonstrate its ability to improve both the data classification and the prediction performance of the soft sensor.

    feature extraction, weighted kernel Fisher criterion, classification, soft sensor

    1 INTRODUCTION

    The petrochemical process is a typical continuous industrial process that is multivariable, nonlinear and uncertain in nature. Its operational data are characterized by multiple modes, strong disturbances and time-dependent dynamics [1]. Therefore, using a single model to describe such a process may result in poor prediction performance and generalization ability [2, 3]. A multi-model approach that takes the object's properties into account can solve these problems. Unfortunately, traditional clustering algorithms may lead to overlapping subclasses, so that edge classes and outliers cannot be handled effectively. Linear discriminant analysis (LDA) maximizes the discrimination between subclasses by searching for an optimal projection vector; it is mainly applied in image processing and face recognition to improve classification accuracy [4-8]. Kernel Fisher discriminant analysis (KFDA) is a nonlinear extension of LDA and is more efficient for nonlinear classification problems [9-14]. However, owing to a structural limitation of KFDA, i.e., its less-than-optimal definition of the between-class scatter matrix, it is still difficult to classify the data accurately when edge classes and outliers coexist [15].

    In order to further improve the prediction performance of multi-models based on traditional clustering methods, a multi-model approach based on feature extraction using weighted kernel Fisher discriminant analysis (WKFDA) is introduced in this paper. First, based on the Fisher criterion, a new weighting function with variable weights is proposed to rectify the within-class and between-class scatter matrices. Second, a category information measurement is put forward to describe the within-class and between-class scatter of each dimensional component; this index determines how many feature vectors should be included in the optimal projection matrix. The original data are then classified into multiple sub-classes based on the extracted projection features, and a support vector machine (SVM) sub-model is established for each sub-class using all the samples belonging to it [16-20]. Finally, a multi-model based soft sensor is obtained by using a switcher to combine the SVM sub-models. The proposed method is evaluated on a bisphenol A production process, indicating the real-time phenol composition at the dissolution tank outlet of a crystallization unit. The simulation results demonstrate the effectiveness of the proposed method.

    2 WEIGHTED KERNEL FISHER DISCRIMINANT ANALYSIS

    2.1 Kernel Fisher criterion

    LDA searches for an optimal discriminant vector in a lower-dimensional feature space to maximize the ratio of the between-class scatter matrix to the within-class scatter matrix. After the original high-dimensional data are projected into this feature space, they become much easier to classify, because samples belonging to different sub-classes are more clearly separated while samples within a class are more compact.

    Assume X is an n-dimensional data set consisting of N samples. It can be divided into C sub-classes Xi (i = 1, 2, …, C), where sub-class Xi is composed of Ni samples. The between-class scatter matrix Sb and the within-class scatter matrix Sω are given by [21, 22]

    where m0 is the overall mean of X, and mi and Pi are the mean and prior probability of subclass Xi, respectively.
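As a concrete numerical sketch of Eqs. (1) and (2), the two scatter matrices can be computed as below (numpy; the function name `scatter_matrices` is ours, and the priors Pi are estimated as Ni/N as the text implies):

```python
import numpy as np

def scatter_matrices(X, labels):
    """Between-class (Sb) and within-class (Sw) scatter matrices.

    X: (N, n) data matrix; labels: length-N array of subclass indices.
    The priors Pi are estimated as Ni / N.
    """
    N, n = X.shape
    m0 = X.mean(axis=0)                      # overall mean of X
    Sb = np.zeros((n, n))
    Sw = np.zeros((n, n))
    for c in np.unique(labels):
        Xi = X[labels == c]
        Pi = len(Xi) / N                     # prior probability of subclass i
        mi = Xi.mean(axis=0)                 # mean of subclass i
        diff = (mi - m0)[:, None]
        Sb += Pi * diff @ diff.T             # spread of subclass means around m0
        Sw += Pi * np.cov(Xi.T, bias=True)   # prior-weighted within-class covariance
    return Sb, Sw
```

A quick sanity check is that Sb + Sw equals the total scatter (the covariance of the pooled data), which is the classical decomposition underlying the Fisher criterion.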

    The objective function based on Fisher criterion can be expressed as

    where wopt can be obtained by solving the generalized eigenproblem specified by

    and λ is a generalized eigenvalue.
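Numerically, the generalized eigenproblem of Eq. (4) can be solved as follows (a minimal sketch; the small ridge added to Sω and the function name `lda_directions` are our choices):

```python
import numpy as np

def lda_directions(Sb, Sw, d):
    """Top-d projection vectors wopt solving Sb w = lambda Sw w."""
    n = Sb.shape[0]
    # a tiny ridge keeps Sw invertible when it is (near-)singular
    M = np.linalg.solve(Sw + 1e-8 * np.eye(n), Sb)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)           # largest Fisher ratios first
    W = vecs[:, order[:d]].real
    return W / np.linalg.norm(W, axis=0)     # unit-norm columns
```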

    KFDA extracts discriminant features by mapping the original data into a high-dimensional feature space; these features are essentially nonlinear discriminant features of the original data space. Like SVM and kernel principal component analysis (KPCA), KFDA adopts the kernel trick [23]. Its basic idea is to project the original dataset X into a high-dimensional feature space F with a nonlinear mapping φ: x → φ(x), so that a linear discriminant in F yields a strongly nonlinear discriminant in the input space and each category becomes separable. Supposing the mean value of the total samples is zero after projection, the objective function based on the Fisher criterion is modified to

    where the between-class scatter matrix is

    and the within-class scatter matrix is

    where

    is the mean value of the ith sub-class. Because mapping into the high-dimensional space will lead to increased computational complexity, an inner product kernel function is introduced for implicit calculation so as to extract the feature in the high-dimensional space. The inner product kernel function is defined as

    According to reproducing kernel theory, the solution in F can be expressed in terms of the samples in F. Therefore, wf can be determined by

    where αl (l = 1, 2, …, N) are the corresponding coefficients.

    Replacing the inner product with the kernel function, we have

    where

    Using Eqs. (6) and (10), the following equation can be obtained

    where Kb is calculated by

    and K is an N×N kernel matrix obtained from the kernel mapping, and Λ = diag(Λi), i = 1, 2, …, C, is an N×N block diagonal matrix in which each element of the block Λi is 1/Ni.

    Since

    combining Eqs. (9), (10) and (12) yields

    where Kω = (KK^T − KΛK^T)/N.

    The Fisher linear discriminant in F can be obtained by maximizing this ratio. The solution is given by the eigenvectors of λKωα = Kbα.
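One plausible numerical realization of the eigenproblem λKωα = Kbα is sketched below. Because the display equations are not reproduced in this text, the explicit forms Kb = KΛK^T/N and Kω = (KK^T − KΛK^T)/N are taken from the surrounding definitions, the kernel matrix is double-centred to enforce the zero-projected-mean assumption, and the Gaussian kernel anticipates the definition given later in Section 2.2; the function names are ours:

```python
import numpy as np

def rbf_kernel(X, Y, sigma2):
    """Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma2))

def kfda_directions(X, labels, sigma2, d):
    """Return the centred kernel matrix and the top-d coefficient vectors alpha.

    Lambda is the N x N block-diagonal matrix whose block for subclass i
    holds 1/Ni; the projections of the training data are z = K @ alpha.
    """
    N = len(X)
    K = rbf_kernel(X, X, sigma2)
    J = np.full((N, N), 1.0 / N)
    K = K - J @ K - K @ J + J @ K @ J        # centre: projected total mean = 0
    Lam = np.zeros((N, N))
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        Lam[np.ix_(idx, idx)] = 1.0 / len(idx)
    Kb = K @ Lam @ K.T / N                   # between-class kernel scatter
    Kw = (K @ K.T - K @ Lam @ K.T) / N       # within-class kernel scatter
    # a small ridge keeps Kw invertible; then solve lambda * Kw a = Kb a
    vals, vecs = np.linalg.eig(np.linalg.solve(Kw + 1e-6 * np.eye(N), Kb))
    order = np.argsort(-vals.real)
    return K, vecs[:, order[:d]].real
```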

    2.2 Improved algorithm of KFDA

    Based on the traditional Fisher criterion, the definition of the between-class scatter matrix is less than optimal. Because Eq. (1) calculates the distance between the mean of all mapped samples and the mean of the ith sub-class of mapped samples, each subclass may easily deviate from the center of the mapped sample set, resulting in sub-class overlap. Hence, the between-class scatter matrix of Eq. (1) is redefined by calculating the distances between each pair of subclasses as

    Consider the example shown in Fig. 1, where sub-class 3 is defined as an edge class because it is far from sub-classes 1 and 2. Projection axis A in Fig. 1 is obtained by the traditional feature extraction method. Because the edge sub-class 3 is over-emphasized through maximization of its between-class distance, the adjacent sub-classes overlap, as is the case for subclasses 1 and 2 in Fig. 1.

    Figure 1 Different projection axes for different definitions of Sb

    Accordingly, the between-class scatter matrix and the within-class scatter matrix are redefined to

    where W(Δij) and W(Δik) are the weighting functions for the between-class and within-class scatter, respectively.

    The weighted within-class and between-class scatter matrices defined in Eqs. (16) and (17) can effectively decrease the impacts of edge classes and outliers. Projection axis B, obtained with the modified scatter matrices, has better discrimination ability than projection axis A. Similarly, the nonlinear form of the new method is derived using Eqs. (16) and (10). We have

    A Gaussian radial basis function is used as the inner product kernel function, defined as

    This function is widely used for its good learning ability. Furthermore, since the corresponding eigenspace is infinite-dimensional, any finite set of sampled data is linearly separable in it. The weighting function is similarly defined as

    The weights undergo the same nonlinear transformation as the sampled data, and therefore reflect the weighted relationships among the sampled data in the feature space.
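The explicit weighting function W(Δ) of Eq. (21) is not reproduced in this text, so the sketch below illustrates the pairwise weighted between-class scatter of Eq. (16) with a Loog-style inverse-squared-distance weight; the weight choice and the function name `weighted_between_scatter` are ours:

```python
import numpy as np

def weighted_between_scatter(X, labels, weight=lambda d: 1.0 / (d ** 2 + 1e-12)):
    """Pairwise weighted between-class scatter in the spirit of Eq. (16).

    The default inverse-squared-distance weight caps the contribution of any
    pair of subclasses, so a far-away edge class cannot dominate.
    """
    n = X.shape[1]
    classes = np.unique(labels)
    means = {c: X[labels == c].mean(axis=0) for c in classes}
    priors = {c: (labels == c).mean() for c in classes}
    Sb = np.zeros((n, n))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = (means[ci] - means[cj])[:, None]
            dist = np.linalg.norm(diff)
            Sb += priors[ci] * priors[cj] * weight(dist) * (diff @ diff.T)
    return Sb
```

With this particular weight, each pair of subclasses contributes at most PiPj to the trace of the scatter matrix regardless of how far apart they are, which is exactly what prevents an edge class from dominating the projection.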

    2.3 Category information measure and kernel parameters selection

    Generally, the number of retained components d is determined manually. The larger d is, the more information is retained; for classification, however, what matters most is retaining valid category information. After the eigen transformation of the original data, the aggregation degree of the within-class samples and the dispersion degree of the between-class samples can be evaluated for each dimension.

    The aggregation degree of the within-classes for the kth dimensional component can be described as

    The dispersion degree of the between-classes for the kth dimensional component can be described as

    where mik and mjk are the mean values of the kth dimensional component in the ith class and the jth class, respectively.

    Combining Eqs. (23) and (24), an evaluation index, the category information measurement, is defined as

    It is obvious that the smaller Jk is, the higher the classification accuracy obtained.

    In order to retain the most effective category information, the d characteristic components corresponding to the minimum Jk have to be found. Define the accumulated contribution rate of category information as

    where L is the number of initially selected eigenvectors, satisfying λL > 10λL+1. The first d characteristic components whose accumulated contribution rate reaches the set value are then selected.
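Since Eqs. (23)-(25) are likewise not reproduced in this text, the sketch below stands in for them with plain per-dimension variances: the within-class variance measures aggregation, the variance of the class means measures dispersion, and their ratio plays the role of Jk ("smaller is better"); the function names are ours:

```python
import numpy as np

def category_information(Z, labels):
    """Per-dimension ratio Jk of within-class aggregation to between-class dispersion."""
    classes = np.unique(labels)
    means = np.array([Z[labels == c].mean(axis=0) for c in classes])
    within = np.mean([Z[labels == c].var(axis=0) for c in classes], axis=0)
    between = means.var(axis=0) + 1e-12      # dispersion of the class means
    return within / between

def select_components(Z, labels, d):
    """Indices of the d components carrying the most category information."""
    return np.argsort(category_information(Z, labels))[:d]
```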

    3 A MULTI-MODEL METHOD BASED ON FEATURE EXTRACTION USING WEIGHTED KERNEL FISHER CRITERION

    The weighted between-class and within-class scatter matrices decrease the impacts of edge classes and outliers. The category information measurement is applied to determine the number of eigenvectors and thus the optimal feature projection matrix. Subclasses with distinct characteristics can be obtained by mapping the original data into the feature space. Then, sub-models corresponding to the subclasses are established based on the support vector machine (SVM), and a switcher is used to combine all the SVM sub-models.

    3.1 Calculation procedure

    A multiple model is established by combining WKFDA feature extraction and the SVM algorithm. The specific calculation procedure is summarized as follows.

    Step 1: Set the initial parameters, such as the clustering number C and the accumulated contribution rate for category information. The training data are initially classified by the fuzzy C-means clustering algorithm, and the clustering number can be determined by subtractive clustering [24].

    Step 2: The inner product kernel parameters σ of the l groups are selected within a certain range. The eigenvectors and corresponding eigenvalues are calculated from Eqs. (18)-(20). The eigenvectors corresponding to the first Mi (i = 1, 2, …, l) maximum eigenvalues are selected as the initial projection matrix.

    Step 3: After the eigen transformation, the category information measurement Jk (k = 1, …, Mi) is calculated by Eq. (25), and the effective eigenvectors di (i = 1, 2, …, l) corresponding to the l groups of kernel parameters are determined by Eq. (26) and the set value. The Fisher criterion values corresponding to the l groups of effective eigenvectors are then calculated, and the σ2 giving the maximum Fisher value is selected as the optimal kernel parameter. The corresponding optimal projection matrix is constituted by the di eigenvectors.

    Step 4: The training data are transformed by the optimal projection matrix. For each subclass, an SVM sub-model is established separately from the transformed data.

    Step 5: A switcher is used to combine SVM sub-models and then a multi-model based soft sensor is obtained.

    After the projection transformation of an original test sample, the subclass to which it belongs is determined by the WKFDA classification algorithm, and the transformed test sample is then input into the corresponding sub-model to calculate the output prediction. Fig. 2 shows the schematic of the proposed soft sensor, where S represents the switcher and Y is the predicted output.
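Steps 4 and 5 can be sketched as a switcher over per-subclass models. Two simplifications are ours: the sub-models below are ordinary least-squares fits standing in for the SVM sub-models, and the switcher assigns a test sample to the subclass with the nearest mean in the projected space:

```python
import numpy as np

class MultiModelSoftSensor:
    """Switcher S over per-subclass regression sub-models."""

    def fit(self, Z, y, labels):
        self.classes_ = np.unique(labels)
        self.means_ = {c: Z[labels == c].mean(axis=0) for c in self.classes_}
        self.coefs_ = {}
        for c in self.classes_:
            mask = labels == c
            A = np.c_[Z[mask], np.ones(mask.sum())]     # design matrix with bias
            self.coefs_[c], *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        return self

    def predict(self, Z):
        out = np.empty(len(Z))
        for i, z in enumerate(Z):
            # the switcher: nearest subclass mean in the projected space
            c = min(self.classes_, key=lambda k: np.linalg.norm(z - self.means_[k]))
            out[i] = np.r_[z, 1.0] @ self.coefs_[c]     # output Y from sub-model c
        return out
```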

    3.2 Example analysis

    Figure 2 Schematic of the proposed soft sensor

    Figure 3 Classification of the testing data

    To verify the classification accuracy of the proposed method, the benchmark wine dataset from the UCI machine learning database [25] is used for experimental analysis. The wine dataset contains 178 samples, each with 13 input attributes and 1 category attribute. Half of the dataset is used for training and the rest for testing. The classification results are compared with those of the K-nearest neighbors, Bayesian classification and decision tree algorithms. The experiment is repeated several times and the average values are taken as the experimental results, shown in Table 1.

    Table 1 Classification results

    It is obvious that WKFDA has the highest classification accuracy. Furthermore, the sample dataset is easier to classify after the eigen transformation. The first 3 dimensions of the eigenmatrix are selected for comparison with the same dimensions of the normalized original dataset.

    It is clear from Fig. 3 that the subclasses of the original dataset overlap and the existing methods cannot separate them clearly. The proposed method performs an eigen transformation of the original dataset to find the optimal classification projection matrix. The simulation results demonstrate that the data transformed by WKFDA are easier to distinguish and classify.

    4 INDUSTRIAL CASE STUDY

    Crystallization is an important step in the production of bisphenol A. The online phenol content, a key quality index at the dissolution tank outlet of the crystallizer, is estimated by the proposed method. The online phenol content depends on the following 7 main operating variables: dissolving tank temperature, tank level, vacuum filter flow rate, and the contents of 4 components in the stream exiting the upstream unit. A total of 284 groups of data were collected from a production site, 4/5 of which are used for training and 1/5 for testing.

    The parameters of the inner kernel function are set as σ2 = 2.5, and the set value of the accumulated contribution rate is 0.8. According to the analysis of the operating conditions, the number of categories is C = 4. KFDA and WKFDA are applied for classification, and the results are shown in Figs. 4 and 5, respectively. These two figures show that WKFDA is more effective in clustering and achieves better performance.

    Figure 4 Classification based on KFDA extraction of training data

    Figure 5 Classification based on WKFDA extraction of training data

    In order to verify the effectiveness of the proposed method, a single SVM model, a multiple SVM model based on KFDA and a multiple SVM model based on WKFDA are built separately for simulation comparison. The kernel parameters and penalty coefficients of the three models are specified as follows:

    The test errors of simulating results are shown in Table 2.

    Table 2 Test errors

    Table 2 demonstrates that the multiple SVM model based on WKFDA is better than the other two models. The corresponding output prediction results of testing data are shown in Fig. 6.

    Figure 6 Testing curves

    5 CONCLUSIONS

    A new feature extraction method based on a weighted kernel Fisher criterion is proposed by modifying the definitions of the within-class and between-class scatter matrices. The impacts of edge classes and outliers are decreased by weighting so as to avoid subclass overlap. The weighting method is derived in a nonlinear form so that nonlinear problems are handled more effectively. The optimal number of projection vectors is determined by the category information measurement, yielding better classification results. The effectiveness of the proposed method is first validated in an experiment on the benchmark wine dataset from the UCI machine learning database. The method is then applied to a bisphenol A production process to estimate an online quality index. The simulation results show that the proposed method improves both classification accuracy and estimation precision.

    REFERENCES

    1 Li, X.L., Su, H.Y., Chu, J., “Multiple models soft-sensing technique based on online clustering arithmetic”, Journal of Chemical Industry and Engineering, 58 (11), 2835-2839 (2007). (in Chinese)

    2 Liu, L.L., Zhou, L.F., Xie, S.G., “A novel supervised multi-model modeling method based on k-means clustering”, In: Control and Decision Conference (CCDC), Shenyang, 684-689 (2010). (in Chinese)

    3 Li, N., Li, S.Y., Xi, Y.G., “Multi-model modeling method based on satisfactory clustering”, Control Theory & Applications, 20 (5), 783-788 (2003). (in Chinese)

    4 Li, W.B., Sun, L., Zhang, D.K., “Text classification based on labeled—LDA model”, Chinese Journal of Computer, 31 (4), 621-626 (2008). (in Chinese)

    5 Yang, Q., Ding, X., “Discriminant local feature analysis with applications to face recognition”, Journal of Tsinghua University (Science and Technology), 44 (4), 530-533 (2004). (in Chinese)

    6 Huerta, E.B., Duval, B., Hao, J., “A hybrid LDA and genetic algorithm for gene selection and classification of microarray data”, Neurocomputing, 73 (13-15), 2375-2383 (2010).

    7 Sabatier, R., Reynès, C., “Extensions of simple component analysis and simple linear discriminant analysis”, Computational Statistics and Data Analysis, 52 (10), 4779-4789 (2008).

    8 Mohammadi, M., Raahemi, B., Akbari, A., Nassersharif, B., Moeinzadeh, H., “Improving linear discriminant analysis with artificial immune system-based evolutionary algorithms”, Information Sciences, 189, 219-232 (2012).

    9 Chen, C.K., Yang, J.Y., Yang, J., “Fusion of PCA and KFDA for face recognition”, Control and Decision, 19 (10), 1147-1154 (2004). (in Chinese)

    10 Sun, D.R., Wu, L.N., “Face recognition based on nonlinear feature extraction and SVM”, Journal of Electronics & Information Technology, 26 (2), 308-312 (2004). (in Chinese)

    11 Li, Y., Jiao, L.C., “Target recognition based on kernel Fisher discriminant”, Journal of Xidian University, 30 (4), 179-183 (2003). (in Chinese)

    12 Zheng, Y.J., Jian, Y.G., Yang, J.Y., Wu, X.J., “A reformative kernel Fisher discriminant algorithm and its application to face recognition”, Neurocomputing, 69 (13-15), 1806-1810 (2006).

    13 Sun, J.C., Li, X.H., Yang, Y., “Scaling the kernel function based on the separating boundary in input space: A data-dependent way for improving the performance of kernel methods”, Information Sciences, 184 (1), 140-154 (2012).

    14 Maldonado, S., Weber, R., Basak, J., “Simultaneous feature selection and classification using kernel-penalized support vector machines”, Information Sciences, 181 (2), 115-128 (2011).

    15 Wen, Z.M., Cairing, Z., Zhao, L., “Weighted maximum margin discriminant analysis with kernels”, Neurocomputing, 67, 357-362 (2005).

    16 Peng, X.J., “A v-twin support vector machine (v-TSVM) classifier and its geometric algorithms”, Information Sciences, 180 (2), 3863-3875 (2010).

    17 Amari, S., Wu, S., “Improving support vector machine classifiers by modifying kernel functions”, Neural Networks, 12 (6), 783-789 (1999).

    18 Kumar, M.A., Khemchandani, R., Gopal, M., Chandra, S., “Knowledge based least squares twin support vector machines”, Information Sciences, 180 (23), 4606-4618 (2010).

    19 Wu, Q., “Fault diagnosis model based on Gaussian support vector classifier machine”, Expert Systems with Applications, 37 (9), 6251-6256 (2010).

    20 He, Q., Wu, C.X., “Separating theorem of samples in Banach space for support vector machine learning”, International Journal of Machine Learning and Cybernetics, 184 (1), 49-54 (2011).

    21 Loog, M., Duin, R.P.W., Haeb-Umbach, R., “Multiclass linear dimension reduction by weighted pairwise Fisher criteria”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 23 (7), 762-766 (2001).

    22 Li, X.H., Li, X., Guo, G.R., “Feature extraction of HRRP based on LDA algorithm”, Journal of National University of Defense Technology, 27 (5), 72-77 (2005). (in Chinese)

    23 Du, S.Q., “Methods of face recognition based on kernel Fisher discriminant”, Master Thesis, Shanxi Normal University, China (2007). (in Chinese)

    24 Zhang, X., “Research and application of FCM initialization method”, Master Thesis, Southwest University, China (2006). (in Chinese)

    25 Frank, A., Asuncion, A., “UCI machine learning repository”, 2010, http://archive.ics.uci.edu/ml.

    Received 2012-08-10, accepted 2012-10-06.

    * Supported by the National Natural Science Foundation of China (61273070) and the Foundation of Priority Academic Program Development of Jiangsu Higher Education Institutions.

    ** To whom correspondence should be addressed. E-mail: yhz@jiangnan.edu.cn
