
    Latent Variable Regression for Supervised Modeling and Monitoring

IEEE/CAA Journal of Automatica Sinica, 2020, Issue 3

    Qinqin Zhu

Abstract—A latent variable regression algorithm with a regularization term (rLVR) is proposed in this paper to extract latent relations between process data $X$ and quality data $Y$. In rLVR, the prediction error between $X$ and $Y$ is minimized, which is proved to be equivalent to maximizing the projection of quality variables in the latent space. The geometric properties and model relations of rLVR are analyzed, and the geometric and theoretical relations among rLVR, partial least squares, and canonical correlation analysis are also presented. The rLVR-based monitoring framework is developed to monitor process-relevant and quality-relevant variations simultaneously. The prediction and monitoring effectiveness of the rLVR algorithm is demonstrated through both numerical simulations and the Tennessee Eastman (TE) process.

    I. Introduction

IN industrial processes, timely process monitoring is of great importance: it helps detect potential hazards and enhance operation safety, thus contributing substantially to tomorrow's industry and imparting significant economic benefits. Traditionally, routine examinations by experienced personnel were the major approach to detecting anomalies, which, however, is prone to error and not completely reliable or comprehensive. With the advancement of technologies in data collection, transmission and storage, a new effective monitoring scheme based on multivariate analytical methods has emerged to track variations in the process in a timely and reliable fashion, and it is widely applied in chemical engineering, biology, pharmaceutical engineering, and management science [1]–[6]. Among these methods, principal component analysis (PCA), partial least squares (PLS) and canonical correlation analysis (CCA) are three popular and effective algorithms for multivariate process monitoring.

PCA is a powerful tool to discover important patterns and reduce the dimension of process data. It decomposes the original process space into the principal component subspace (PCS) with large variances and the residual subspace (RS), which mainly contains noise [7]. The monitoring scheme based on PCA is well defined to detect anomalies in the PCS with the $T^2$ statistic and those in the RS with the $Q$ statistic [1], [8]. In industrial processes, product quality is of major concern. The PCA-based monitoring scheme, however, fails to build the connection between $X$ and the quality variables $Y$, and the information of $Y$ is not available in either the modeling or the monitoring stage, which makes it hard to identify whether faulty samples will affect product quality. Thus, supervised algorithms such as PLS and CCA are preferred.

PLS extracts the latent variables by maximizing the covariance between $X$ and $Y$, so quality information is successfully captured in the latent model. Since PLS pays attention to both process and quality variables, the captured latent variables contain variations that are orthogonal or irrelevant to $Y$, and further decomposition is necessary for comprehensive modeling with PLS [9]–[11]. Another issue involved in PLS is that its objectives for outer and inner modeling are different: the outer model maximizes the covariance between $X$ and $Y$, while the inner model tries to minimize the regression error between the process and quality scores. The discrepancy reduces the prediction efficiency of PLS.

Similar to PLS, CCA works as a supervised modeling algorithm that constructs its latent space with the supervision of quality variables. The latent variables are extracted by maximizing the correlation between $X$ and $Y$, and all the information contained in the latent variables is relevant to $Y$; thus CCA can get rid of the effect of process variances and retains a better prediction performance than PLS. The remaining information left in the process and quality variables may still be valuable to reveal abnormal variations in the data, which can contribute to the improvement of operation safety and economic efficiency; however, these variations remain unexploited in CCA, and concurrent CCA was proposed to conduct a full decomposition on $X$ and $Y$ [12].

$T^2$ and $Q$ statistics are employed to monitor variations in the PCS and RS subspaces for PLS and CCA as well, and obtain satisfactory performance [8], [13]–[15]. Distributed process monitoring frameworks have also been developed for plantwide processes [16]–[18].

In this paper, we propose a new regularized latent variable regression (rLVR) algorithm to build the relation between process and quality data, which is designed to have consistent outer and inner modeling objectives. Different from PLS and CCA, rLVR pays attention to both the correlation between $X$ and $Y$ and the variance of $Y$, which achieves better prediction power. The geometric properties and model relations of rLVR are analyzed, which reveals the orthogonality of the extracted latent variables. An rLVR-based monitoring framework is also developed to monitor process-relevant and quality-relevant variations.

The remainder of this paper is organized as follows. In Section II, the traditional latent variable methods, PLS and CCA, are reviewed. Section III presents the motivation and details of the regularized LVR algorithm. The geometric properties and model relations of rLVR and its equivalence with PLS and CCA are demonstrated in Section IV. The comprehensive monitoring scheme based on rLVR is developed in Section V. Section VI employs two case studies, a numerical simulation and the Tennessee Eastman process, to illustrate the effectiveness of rLVR over PLS and CCA from the prediction and monitoring perspectives. The conclusions are drawn in the last section.

    II. Latent Structured Methods

    A. Projection to Latent Structures

Projection to latent structures (PLS), which is also referred to as partial least squares, is a typical supervised dimensionality reduction algorithm. It constructs a lower-dimensional space by maximizing the covariance between the projections of process data $X \in \mathbb{R}^{n \times m}$ and quality data $Y \in \mathbb{R}^{n \times p}$ in the latent space, where $n$ is the number of training samples, and $m$ and $p$ are the numbers of process variables and quality variables, respectively. The objective of PLS is mathematically presented as

$$\max_{w,q}\; w^{T}X^{T}Yq \quad \text{s.t.}\; \|w\| = 1,\ \|q\| = 1 \tag{1}$$

where the score vectors $t = Xw \in \mathbb{R}^{n}$ and $u = Yq \in \mathbb{R}^{n}$ are the projections of $X$ and $Y$ on the latent space, respectively, and the weighting vectors $w \in \mathbb{R}^{m}$ and $q \in \mathbb{R}^{p}$ are the projecting directions. Equation (1) is also called the outer modeling objective of PLS, and its solution can be derived iteratively with the aid of Lagrange multipliers.

In inner modeling, PLS builds a linear regression model between $t$ and $u$,

$$u = bt + \epsilon \tag{2}$$

where $b$ is the regression coefficient and $\epsilon$ is the modeling error. The first term in the above equation denotes the prediction of the quality score from the input score $t$, and the regression coefficient $b$ can be obtained by minimizing the modeling error between $u$ and its prediction $\hat{u} = bt$, which gives $b = \frac{t^{T}u}{t^{T}t}$.
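To make the two-step structure concrete, the following is a minimal NIPALS-style sketch of one PLS component; the function name, convergence test, and the initialization of $u$ with the first column of $Y$ are illustrative choices, not the paper's listing.

```python
import numpy as np

def pls_component(X, Y, tol=1e-10, max_iter=500):
    # One PLS component: outer iteration for w, q per (1), inner coefficient b per (2).
    # X (n x m) and Y (n x p) are assumed zero-mean.
    u = Y[:, [0]]                          # initial quality score (a common heuristic)
    for _ in range(max_iter):
        w = X.T @ u
        w /= np.linalg.norm(w)             # enforce ||w|| = 1
        t = X @ w                          # process score t = Xw
        q = Y.T @ t
        q /= np.linalg.norm(q)             # enforce ||q|| = 1
        u_new = Y @ q                      # quality score u = Yq
        if np.linalg.norm(u_new - u) < tol * np.linalg.norm(u_new):
            u = u_new
            break
        u = u_new
    b = (t.T @ u).item() / (t.T @ t).item()  # inner model: u ~ b t
    return w, q, t, u, b
```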

    B. Canonical Correlation Analysis

Canonical correlation analysis (CCA) [21], also known as canonical variate analysis (CVA), works as a counterpart of PLS that extracts latent structures by maximizing the correlation between $X$ and $Y$, whose objective is formulated as

$$\max_{w,q}\; \frac{w^{T}X^{T}Yq}{\sqrt{w^{T}X^{T}Xw}\,\sqrt{q^{T}Y^{T}Yq}} \tag{3}$$

or equivalently,

$$\max_{w,q}\; w^{T}X^{T}Yq \quad \text{s.t.}\; w^{T}X^{T}Xw = 1,\ q^{T}Y^{T}Yq = 1. \tag{4}$$

In contrast to the iterative scheme of PLS, the latent variables of CCA can be derived directly by singular value decomposition (SVD). The detailed procedure of CCA is summarized as follows.

1) Pre-process $X$ and $Y$ to make them zero-mean and unit-variance;

2) Perform SVD on the scaled data,

$$(X^{T}X)^{-1/2}X^{T}Y(Y^{T}Y)^{-1/2} = U\Sigma V^{T};$$

3) Calculate the weighting matrices $W = [w_1, w_2, \dots, w_l]$ and $Q = [q_1, q_2, \dots, q_l]$ from the leading singular vectors,

$$W = (X^{T}X)^{-1/2}U_{l}, \qquad Q = (Y^{T}Y)^{-1/2}V_{l}.$$

The deflation of CCA can be obtained similarly to PLS by minimizing $\|X - TP^{T}\|$ and $\|Y - TC^{T}\|$, leading to

$$P = X^{T}T(T^{T}T)^{-1}, \qquad C = Y^{T}T(T^{T}T)^{-1}$$

where $T = XW$, and $P$ and $C$ are the loading matrices for the process and quality variables in CCA.
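A compact numerical sketch of this procedure is given below; the helper `inv_sqrt` and the use of unscaled Gram matrices are illustrative, and the inverse square roots are exactly where the ill-conditioning discussed next appears when the data are strongly collinear.

```python
import numpy as np

def cca_svd(X, Y, l):
    # CCA weights and loadings by SVD; X, Y are assumed zero-mean, unit-variance
    # (Step 1) and their Gram matrices nonsingular.
    def inv_sqrt(S):
        # S^{-1/2} via eigendecomposition of the symmetric matrix S
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    Sx_inv = inv_sqrt(X.T @ X)
    Sy_inv = inv_sqrt(Y.T @ Y)
    U, s, Vt = np.linalg.svd(Sx_inv @ (X.T @ Y) @ Sy_inv)   # Step 2
    W = Sx_inv @ U[:, :l]            # Step 3: W = [w1, ..., wl]
    Q = Sy_inv @ Vt.T[:, :l]         #         Q = [q1, ..., ql]
    T = X @ W
    G = np.linalg.inv(T.T @ T)
    P = X.T @ T @ G                  # least-squares loadings from min ||X - T P^T||
    C = Y.T @ T @ G                  # and min ||Y - T C^T||
    return W, Q, T, P, C
```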

It is known that CCA is sensitive to noise in the presence of strong collinearity. Thus, regularized CCA (rCCA) was proposed to address the ill-conditioned behavior by designing two regularization terms on the process and quality sides [12].

    C. Several Notes for PLS and CCA

In (1), PLS pays attention to both the correlation between $X$ and $Y$ and their variances, so the extracted latent variables contain irrelevant or orthogonal information, which makes no contribution to predicting $Y$. As a consequence, PLS needs a superfluous number of latent variables. For instance, multiple latent variables may be required to predict even one-dimensional quality data $Y$. Thus, further decomposition is designed in subsequent works, such as total PLS [9] and concurrent PLS [10]. Another issue involved with PLS is the inconsistent objectives for outer and inner modeling, as observed in (1) and (2).

CCA achieves better prediction or modeling performance by focusing only on the correlation between process and quality variables. However, CCA attaches equal importance to process and quality variables, and the modeling performance can be further improved by incorporating the variances and signals of the quality variables $Y$.

    Therefore, motivated by the aforementioned analysis, we propose the latent variable regression (LVR) method in the next section [22].

    III. Latent Variable Regression

    A. Outer Model

In LVR, in order to make the inner and outer modeling consistent, the following outer objective is designed:

$$\min_{w,q}\; J_{\text{outer}} = \|Xw - Yq\|^{2} \quad \text{s.t.}\; q^{T}q = 1 \tag{5}$$

where the symbols have the same meanings as in (1). It is noted that the constraint in (5) is different from those for PLS, which is designed on purpose and will be explained in the following subsections.

The solution of (5) can be obtained with the Lagrange multiplier $\lambda_q$ as follows:

$$J = \|Xw - Yq\|^{2} - \lambda_{q}\left(q^{T}q - 1\right).$$

Taking derivatives with respect to $w$ and $q$ and setting the results to zero results in

$$X^{T}Xw = X^{T}Yq \tag{6}$$

$$Y^{T}Yq - Y^{T}Xw = \lambda_{q}q. \tag{7}$$

By re-arranging the above equations, we have

$$w = (X^{T}X)^{-1}X^{T}Yq \tag{8}$$

$$\left(Y^{T}Y - Y^{T}X(X^{T}X)^{-1}X^{T}Y\right)q = \lambda_{q}q \tag{9}$$

where $w^{T}$ and $q^{T}$ are pre-multiplied in (6) and (7), which leads to

$$t^{T}t = t^{T}u, \qquad u^{T}u - u^{T}t = \lambda_{q}.$$

Lemma 1: The least squares objective for latent variable regression in (5) is equivalent to minimizing the Lagrange multiplier $\lambda_q$.

Proof: In (5), $J_{\text{outer}}$ can be expanded as

$$J_{\text{outer}} = u^{T}u - 2t^{T}u + t^{T}t = \left(u^{T}u - u^{T}t\right) - \left(t^{T}u - t^{T}t\right) = \lambda_{q}.$$

Thus, minimizing the prediction error between the projection scores $t$ and $u$ in LVR is equivalent to finding the minimum eigenvalue $\lambda_q$ of (9). ∎

    B. Inner Model

Equations (8) and (9) constitute the outer modeling of LVR. For inner modeling, the same objective is applied; that is, to minimize the least squares error

$$\min_{b}\; \|u - bt\|^{2} \tag{10}$$

which leads to

$$b = \frac{t^{T}u}{t^{T}t} = 1 \tag{11}$$

since $t^{T}u = t^{T}t$ from (6).

Remark 1: Inner modeling is not needed in the latent variable regression method.

It is noted that Remark 1 is an expected result due to the identical outer and inner objectives in (5) and (10).

C. Deflation of X and Y

Deflation of $X$ and $Y$ is performed to remove the effects of the extracted latent variables, which can be represented as

$$X := X - tp^{T}, \qquad Y := Y - tc^{T}$$

where $p \in \mathbb{R}^{m}$ and $c \in \mathbb{R}^{p}$ are the loading vectors for $X$ and $Y$, respectively; they can be calculated by minimizing the regression errors $\|X - tp^{T}\|^{2}$ and $\|Y - tc^{T}\|^{2}$, leading to

$$p = \frac{X^{T}t}{t^{T}t}, \qquad c = \frac{Y^{T}t}{t^{T}t}. \tag{12}$$

    Therefore, the procedure to extract latent variables in LVR is summarized as follows.

1) Scale the process and quality data $X$ and $Y$ to zero mean and unit variance.

2) Initialize $u$ as the first column of $Y$, and repeat the following relations until convergence is achieved:

$$w = (X^{T}X)^{-1}X^{T}u, \qquad t = Xw, \qquad q = \frac{Y^{T}t}{\|Y^{T}t\|}, \qquad u = Yq.$$

3) Deflate $X$ and $Y$ by

$$X := X - tp^{T}, \qquad Y := Y - tc^{T}.$$

4) Conduct Steps 2 and 3 for the next round until $l$ latent variables are extracted, where $l$ is determined by cross validation; one extraction round is sketched in code below.
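The sketch below implements one extraction round under the relations above; it assumes zero-mean data, and the relative-change convergence test and the function and variable names are illustrative, not the paper's listing.

```python
import numpy as np

def lvr_component(X, Y, tol=1e-10, max_iter=500):
    # One LVR latent variable, following Steps 2)-3) above.
    XtX = X.T @ X
    u = Y[:, [0]]                           # Step 2): u initialized to Y's first column
    for _ in range(max_iter):
        w = np.linalg.solve(XtX, X.T @ u)   # w = (X^T X)^{-1} X^T u, cf. (8)
        t = X @ w
        q = Y.T @ t
        q /= np.linalg.norm(q)              # ||q|| = 1
        u_new = Y @ q
        if np.linalg.norm(u_new - u) < tol * np.linalg.norm(u_new):
            u = u_new
            break
        u = u_new
    tt = (t.T @ t).item()
    p = X.T @ t / tt                        # loadings per (12)
    c = Y.T @ t / tt
    X_next = X - t @ p.T                    # Step 3): deflation
    Y_next = Y - t @ c.T
    return w, q, t, p, c, X_next, Y_next
```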

    D. Regularized Latent Variable Regression

In LVR, the inversion of the covariance of $X$ is involved in calculating the weighting vector $w$, and collinearity in $X$ will lead to inconsistent results, as shown in Fig. 1. In Fig. 1(a), two columns of $X$ are tightly correlated, and since the angle between $x_1$ and $x_2$ is not zero, Plane 1 is formed. In this case, the quality variable $y$ can be projected onto Plane 1, and the projection is $y'$. However, in most cases the data are subject to noise, which makes $x_1$ deviate from its original direction to a perturbed direction $\tilde{x}_1$. In Fig. 1(b), the new plane, Plane 2, defined by $\tilde{x}_1$ and $x_2$, drastically diverges from Plane 1, and the new projection $\hat{y}$ in Plane 2 is also different from $y'$. As concluded from Fig. 1, when the data are strongly collinear, the results of LVR are not reliable or consistent. Thus, it is necessary to address the collinearity issue, which can be achieved by constraining the norm of $w$ with a regularization term.

Lemma 2: The LVR objective in (5) is equivalent to the following objective [23]:

$$\max_{w,q}\; w^{T}X^{T}Yq \quad \text{s.t.}\; w^{T}X^{T}Xw = 1,\ \|q\| = 1. \tag{13}$$

Fig. 1. Ill-conditioned performance caused by collinearity. (a) Projection of $y$; (b) Large deviation of the projection of $y$ caused by noise.

The proof details are given in Appendix A. The new objective in (13) for LVR is similar to the formulations for PLS in (1) and for CCA in (4); however, due to the different constraints, the derived solutions and their geometric properties are different. Since the objective in (13) is more conventional in process monitoring, it is adopted to develop the regularized LVR (rLVR) algorithm.

A regularization term is designed in rLVR as follows to address strong collinearity [22]:

$$\max_{w,q}\; w^{T}X^{T}Yq \quad \text{s.t.}\; w^{T}\left(X^{T}X + \kappa I\right)w = 1,\ \|q\| = 1 \tag{14}$$

where $\kappa$ is the regularization parameter.

The detailed rLVR algorithm is summarized in Algorithm 1. There are two parameters involved in Algorithm 1, $l$ and $\kappa$, and they can be determined jointly with cross validation.

Algorithm 1 Regularized Latent Variable Regression
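The original listing of Algorithm 1 is not reproduced in this copy; the following is a minimal sketch of the whole procedure under the relations derived above, where the unit-score normalization after the regularized solve is an assumption about the scaling convention, not the paper's exact listing.

```python
import numpy as np

def rlvr(X, Y, l, kappa, tol=1e-10, max_iter=500):
    # Sketch of Algorithm 1: extract l latent variables with the regularized
    # relation w ~ (X^T X + kappa*I)^{-1} X^T u, cf. (14).
    X, Y = X.copy(), Y.copy()
    m = X.shape[1]
    W, Q, T, P, C = [], [], [], [], []
    for _ in range(l):
        reg = X.T @ X + kappa * np.eye(m)
        u = Y[:, [0]]
        for _ in range(max_iter):
            w = np.linalg.solve(reg, X.T @ u)
            t = X @ w
            nrm = np.linalg.norm(t)
            w, t = w / nrm, t / nrm                # scale so that ||t|| = 1 (assumed)
            q = Y.T @ t
            q /= np.linalg.norm(q)                 # ||q|| = 1
            u_new = Y @ q
            if np.linalg.norm(u_new - u) < tol * np.linalg.norm(u_new):
                u = u_new
                break
            u = u_new
        p = X.T @ t                                # (12) with t^T t = 1
        c = Y.T @ t
        X -= t @ p.T                               # deflate both blocks, cf. (16)
        Y -= t @ c.T
        W.append(w); Q.append(q); T.append(t); P.append(p); C.append(c)
    return [np.hstack(M) for M in (W, Q, T, P, C)]
```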

    IV. Geometric Properties and Model Relations

    A. rLVR Geometric Properties

PCA, PLS and CCA extract their latent variables by maximizing a statistical metric (variance, covariance or correlation), and their geometric properties are well studied [2], [13], [24]. The idea of the rLVR algorithm is different from those of PCA, PLS and CCA, and it is important to understand the structure of its latent space for further applications.

For ease of illustration, a subscript $i$ is used to denote the iteration order. For instance, $t_i$ denotes the $i$th latent score, and $X_i$ and $Y_i$ are the deflated process and quality datasets in the $i$th extraction round, so that

$$X_{i+1} = X_i - t_ip_i^{T}, \qquad Y_{i+1} = Y_i - t_ic_i^{T}, \qquad X_1 = X,\ Y_1 = Y. \tag{16}$$

Then, with this notation and the relations in Algorithm 1, we have the following lemma.

Lemma 3: We have the following orthogonal properties between the residuals and the model parameters in the regularized LVR algorithm:

The proof of Lemma 3 is given in Appendix B. With the orthogonal relations in Lemma 3, it is straightforward to derive the orthogonality among the model scores, weights and loadings, which is summarized in the following theorem.

Theorem 1: The following orthogonal geometric properties hold for regularized LVR.

Proof: To prove Relations 1 and 2, the $p$ expression in (15) is utilized.

For Relation 3, assuming that $i > j$, the result follows from the relations in Lemma 3, and the case $i < j$ is proved similarly.

Additionally, from the Lagrange relations of (14), we have

$$X_i^{T}Y_iq_i = \lambda_{w}\left(X_i^{T}X_i + \kappa I\right)w_i, \qquad Y_i^{T}X_iw_i = \lambda_{q}q_i$$

where $\lambda_w$ and $\lambda_q$ are the Lagrange multipliers for $w_i$ and $q_i$, respectively. Thus, we have

$$w_i = c\left(X_i^{T}X_i + \kappa I\right)^{-1}X_i^{T}Y_iq_i$$

where $c$ is a normalization coefficient. With the relations in Lemma 3 and assuming $i > j$, Relation 4 follows; the case $i < j$ is proved similarly. Therefore, Relation 4 is proved. ∎

Theorem 1 shows that the scores $t_i$ are mutually orthogonal, and the deflation of the process and quality datasets can be represented as

$$X = \sum_{i=1}^{l} t_ip_i^{T} + X_{l+1}, \qquad Y = \sum_{i=1}^{l} t_ic_i^{T} + Y_{l+1}. \tag{20}$$

Equation (20) implies that, in order to calculate $w$ and $q$ in (6) and (7), only one dataset needs to be deflated in further iterations.

    B. rLVR Model Relations

After performing rLVR, the process and quality data can be predicted by

$$\hat{X} = TP^{T}, \qquad \hat{Y} = TQ^{T} \tag{21}$$

where $P = [p_1, p_2, \dots, p_l]$ and $Q = [q_1, q_2, \dots, q_l]$. Both $P$ and $Q$ are available from the training stage, while $T$ varies with the process data $X$. Thus, it is necessary to derive the explicit relation between $X$ and $T$.

According to (39) in Appendix B, $X_i$ can be re-arranged into

$$X_i = XN_{1:i-1}, \qquad N_{1:i-1} = \prod_{j=1}^{i-1}\left(I - w_jp_j^{T}\right).$$

Then each score vector $t_i$ in $T$ is calculated by

$$t_i = X_iw_i = Xr_i$$

where $r_i \equiv N_{1:i-1}w_i$. With Relations 1 and 2 in Theorem 1, it is easy to show that

$$T = XR \tag{23}$$

where $R = [r_1, r_2, \dots, r_l]$ and $W = [w_1, w_2, \dots, w_l]$. Therefore, $T$ and the predictions of $X$ and $Y$ can be calculated from the process data directly.
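A sketch of this direct calculation is given below; here $R$ is formed via the PLS-style identity $R = W(P^{T}W)^{-1}$, which satisfies $R^{T}P = I$ (cf. Appendix C), rather than by the paper's recursive construction through $N_{1:i-1}$.

```python
import numpy as np

def rlvr_scores_and_predictions(X_new, W, P, Q):
    # Direct calculation of scores and predictions from process data.
    # W, P, Q are the trained weighting and loading matrices.
    R = W @ np.linalg.inv(P.T @ W)   # satisfies R^T P = I
    T = X_new @ R                    # scores, cf. T = XR in (23)
    X_hat = T @ P.T                  # modeled process variation, cf. (21)
    Y_hat = T @ Q.T                  # quality prediction, cf. (21)
    return T, X_hat, Y_hat
```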

The following properties hold for $R$ and $P$.

Lemma 4: $PR^{T}$ and $I - PR^{T}$ are idempotent matrices. That is,

$$\left(PR^{T}\right)\left(PR^{T}\right) = PR^{T}, \qquad \left(I - PR^{T}\right)\left(I - PR^{T}\right) = I - PR^{T}.$$

The proof of Lemma 4 is provided in Appendix C. Lemma 4 demonstrates that both $PR^{T}$ and $I - PR^{T}$ are orthogonal projection matrices.

In online prediction, the prediction of the quality data is calculated from the new sample $x$ directly,

$$\hat{y} = QR^{T}x \tag{26}$$

and the new sample $x$ can be modeled as

$$\hat{x} = PR^{T}x, \qquad \tilde{x} = \left(I - PR^{T}\right)x. \tag{27}$$

Theorem 2: The regularized latent variable regression algorithm decomposes the sample $x$ into the two parts in (27).

Proof: From (27), we have

$$x = \hat{x} + \tilde{x} = PR^{T}x + \left(I - PR^{T}\right)x \tag{29}$$

and, with Lemma 4, the first item in (29) satisfies

$$PR^{T}\hat{x} = \left(PR^{T}\right)\left(PR^{T}\right)x = PR^{T}x = \hat{x}.$$

Similarly, the second item satisfies

$$\left(I - PR^{T}\right)\tilde{x} = \tilde{x}.$$

Therefore, $\hat{x}$ and $\tilde{x}$ are the projections of $x$ onto the two subspaces, and the decomposition in (27) holds. ∎

    C. Relation Among PLS, CCA and LVR

A generalized formulation of PLS, CCA [21] and LVR can be derived as [25]

$$\max_{w,q}\; w^{T}X^{T}Yq \quad \text{s.t.}\; \alpha_{w}\|w\|^{2} + (1 - \alpha_{w})\,w^{T}X^{T}Xw = 1,\ \ \alpha_{q}\|q\|^{2} + (1 - \alpha_{q})\,q^{T}Y^{T}Yq = 1 \tag{30}$$

where $0 \le \alpha_w, \alpha_q \le 1$.

PLS, CCA and LVR are three special cases of (30):

1) When $\alpha_w = 1$ and $\alpha_q = 1$, (30) reduces to PLS;

2) When $\alpha_w = 0$ and $\alpha_q = 0$, (30) reduces to CCA, and the constraints of $\|w\| = 1$ and $\|q\| = 1$ are equivalent to adding a regularization term for $X$ and $Y$, respectively;

3) When $\alpha_w = 0$ and $\alpha_q = 1$, (30) stands for LVR, and the regularization term is incorporated with the extra constraint $\|w\| = 1$.

Geometrically, for ease of comparison, the objectives of PLS, CCA and LVR can be re-arranged as

$$J_{\text{PLS}} = \|u\|\|t\|\cos\theta, \qquad J_{\text{CCA}} = \cos\theta, \qquad J_{\text{LVR}} = \|u\|\cos\theta$$

where $\theta$ is the angle between $u$ and $t$, and, for simplicity, the regularization term is omitted in the discussion of LVR. The geometric relations among PLS, CCA and LVR are presented in Fig. 2.

    Fig. 2. The geometric relations among PLS, CCA and LVR.

As discussed in Section II, in addition to the relation between the scores $u$ and $t$, PLS also emphasizes the variances of $X$ and $Y$, as shown in Fig. 2, so the extracted latent variables obtain less effective prediction power.

In contrast, CCA maximizes the correlation between the projections of $X$ and $Y$ in the latent space, so it focuses only on the angle between $u$ and $t$. CCA works well for prediction; however, since the variances of the process and quality spaces are not exploited, further decompositions are necessary for good monitoring performance [12].

The proposed LVR algorithm maximizes the projection of the quality scores on the latent space, and both the variance of $Y$ and the angle between $u$ and $t$ are considered, leading to better prediction effectiveness.

    V. Process Monitoring With rLVR

It is important to develop a monitoring system based on the extracted latent variables of rLVR to detect anomalies in both the principal component subspace (PCS) and the residual subspace (RS).

The variations in the PCS are relevant to the quality variables and contain large variances. Assuming that they are normally distributed, the $T^2$ index can be utilized to monitor quality-relevant variations in this subspace [1]. For a new sample $x$, its $T^2$ is calculated by

$$T^{2} = t^{T}\Lambda^{-1}t \tag{31}$$

where $t = R^{T}x$, and $\Lambda = \frac{1}{n-1}T^{T}T$ is a diagonal matrix. The threshold for $T^2$ can be defined as

$$T^{2}_{\lim} = \frac{l(n^{2} - 1)}{n(n - l)}F_{l,\,n-l,\,\alpha} \tag{32}$$

where $F_{l,n-l,\alpha}$ denotes an $F$-distribution with $l$ and $n - l$ degrees of freedom, and $\alpha$ defines the confidence level.
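The index and its limit are straightforward to compute; the sketch below assumes a trained $R$ and the training score matrix $T$, and uses SciPy's $F$-distribution quantile for the threshold.

```python
import numpy as np
from scipy import stats

def t2_index(x, R, T_train, alpha=0.01):
    # T^2 of a new sample x (1-D array) and its control limit, cf. (31)-(32).
    n, l = T_train.shape
    t = R.T @ x                                   # latent scores, t = R^T x
    Lam = T_train.T @ T_train / (n - 1)           # Lambda = T^T T / (n - 1)
    T2 = t @ np.linalg.solve(Lam, t)
    limit = l * (n**2 - 1) / (n * (n - l)) * stats.f.ppf(1 - alpha, l, n - l)
    return T2, limit                              # fault flagged when T2 > limit
```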

The information contained in the residual space is not related to the quality variables, but it is still beneficial to monitor the variations in the RS for the sake of operation efficiency and safety. It is not appropriate to use the $Q$ statistic directly as in PCA [1], [2], since the variances in the RS may still be very large. Thus, a subsequent PCA decomposition is applied to the residual $\tilde{X} = X - TP^{T}$ to extract the latent structure in the RS,

$$\tilde{X} = T_{r}P_{r}^{T} + \tilde{X}_{r}$$

where $T_r$ and $\tilde{X}_r$ can be monitored with the $T_{r}^{2}$ and $Q_{r}$ indices, respectively.

The detailed monitoring statistics and their corresponding thresholds are summarized in Table I, and the monitoring scheme is as follows.

1) If $T^2$ exceeds its control limit, a quality-relevant fault is detected with $(1-\alpha)\times 100\%$ confidence.

TABLE I Monitoring Statistics and Control Limits

Statistic | Subspace | Control limit
$T^{2} = t^{T}\Lambda^{-1}t$ | quality-relevant PCS | $\frac{l(n^{2}-1)}{n(n-l)}F_{l,n-l,\alpha}$
$T_{r}^{2} = t_{r}^{T}\Lambda_{r}^{-1}t_{r}$ | process principal subspace | $\frac{l_{r}(n^{2}-1)}{n(n-l_{r})}F_{l_{r},n-l_{r},\alpha}$
$Q_{r} = \|\tilde{x}_{r}\|^{2}$ | process residual subspace | $g\chi^{2}_{h,\alpha}$

2) If $T_r^2$ is larger than its control limit, a process-relevant fault is detected with $(1-\alpha)\times 100\%$ confidence, and the fault will not affect the quality variables.

3) If $Q_r$ exceeds its threshold, the relation between the process and quality variables might be broken, which needs further investigation.

Remark 2: When the quality measurements are available, a similar decomposition can be further applied to the residual of the quality variables, and the corresponding monitoring scheme can be developed, which is referred to as concurrent monitoring, as developed for PLS [10] and CCA [12].

    VI. Case Studies

    A. Case Studies on Simulation Data

To verify the effectiveness and robustness of LVR, we generate two scenarios in this section, and collinearity is introduced in both. rLVR, regularized CCA (rCCA) [12] and PLS are performed on the first scenario, and their performance is compared in terms of the correlation coefficient and the proportion of variance explained of the process and quality variables. In Scenario 2, different noise levels are designed to show the robustness of rLVR.

1) Scenario 1: The following expressions are used to generate the data for the first scenario,

$$x = At + e, \qquad y = Cx + v \tag{37}$$

where $A \in \mathbb{R}^{5\times 5}$ and $C \in \mathbb{R}^{4\times 5}$ are constant coefficient matrices, $e \in \mathbb{R}^{5} \sim N(0, 0.2^{2})$, $v \in \mathbb{R}^{4} \sim N(0, 0.8^{2})$, and $t \in \mathbb{R}^{5}$ with $t_i \sim N(0, (3i)^{2})$, $i \in \{1, 2, \dots, 5\}$. It is noted that strong collinearity is introduced in both $X$ and $Y$: the 2nd and 4th columns and the 3rd and 5th columns of $A$, and the 2nd and 4th rows of $C$, are highly dependent.

800 samples are generated with (37); the first 600 are used as training data, and the remaining ones as test data. The model parameters are selected through cross validation: for rLVR, $l = 3$ and $\kappa = 0.001$; for rCCA, $l = 3$, $\kappa_x = 0.001$, and $\kappa_y = 0.059$; and for PLS, $l = 5$.

The correlation coefficient $r$ and the proportions of variance explained of the process variables (PVE$_x$) and quality variables (PVE$_y$) of these models are shown in Figs. 3–5. As presented in Fig. 3, since rLVR pays attention to both the correlation between $X$ and $Y$ and the variance of $Y$, its correlation coefficient and PVE$_y$ are highest for the first latent component, leaving less information in the residuals. rCCA focuses on maximizing the correlation between the process and quality data; thus its $r$ for each latent variable is relatively high. However, its ability to exploit the process and quality variances is weak, which requires further processing. PLS in Fig. 5 tries to incorporate all three factors (correlation, process variance and quality variance), but the regression relation between $X$ and $Y$ in the PLS model is the weakest among the three models, and it requires five principal components to achieve good performance.

    Fig. 3. Correlation coefficient and proportion of variance explained for rLVR in Scenario 1.

    Fig. 4. Correlation coefficient and proportion of variance explained for rCCA in Scenario 1.

    Fig. 5. Correlation coefficient and proportion of variance explained for PLS in Scenario 1.

2) Scenario 2: The same formulation as (37) is adopted in Scenario 2, and the data are generated with different levels of the noise $v$.

Through cross validation, the model parameters are selected as $l = 3$ and $\kappa = 0.001$ for rLVR and $l = 3$ for LVR in both cases. To compare the robustness of the models, the angle between the weighting vectors $r_i$, $i \in \{1, 2, 3\}$, obtained under different magnitudes of noise is adopted as the metric [12]. The results of LVR and rLVR are summarized in Table II. As observed from the table, when the noise level increases, the angles between the weighting vectors for rLVR remain small, and its constructed latent structure is consistent. However, LVR is sensitive to noise and the resulting angles diverge, as illustrated in Fig. 1.
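A sketch of this robustness metric is given below; treating the angle between corresponding weighting vectors from two noise levels as the comparison quantity is the natural reading of Table II, though the paper's exact normalization is not reproduced here.

```python
import numpy as np

def weight_angle_deg(r_noise_a, r_noise_b):
    # Angle (in degrees) between the weighting vectors obtained under two
    # noise magnitudes, i.e., the robustness metric reported in Table II.
    cos_ang = abs(r_noise_a @ r_noise_b) / (
        np.linalg.norm(r_noise_a) * np.linalg.norm(r_noise_b))
    return np.degrees(np.arccos(np.clip(cos_ang, 0.0, 1.0)))
```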

The regularization term in rLVR handles strongly collinear cases. However, the value of the regularization parameter $\kappa$ should not be too large; otherwise, it has a negative effect on the prediction performance. As shown in Fig. 6, with increasing values of $\kappa$, the mean squared errors (MSEs) of the quality variables increase as well, where $Y_i$ ($i \in \{1,2,3,4\}$) denotes the $i$th quality variable. Additionally, an appropriate number of latent variables $l$ is also important for the effectiveness of rLVR: if $l$ is too small, the extracted latent variables cannot fully exploit the process and quality spaces, leading to sub-optimal prediction and monitoring performance; on the other hand, if $l$ is too large, the extra latent factors tend to introduce noise into the model, which may have negative effects on system modeling. Therefore, it is essential to employ cross validation to determine the values of $\kappa$ and $l$, as sketched below.
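The following is a minimal sketch of that joint selection; the fold construction, the grids, and the `fit`/`predict` callables (standing for routines such as the rLVR sketch above) are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def cross_validate(X, Y, l_grid, kappa_grid, fit, predict, n_folds=5, seed=0):
    # Joint grid search over l and kappa by K-fold cross-validation,
    # scored on the quality-prediction MSE.
    n = X.shape[0]
    folds = np.array_split(np.random.default_rng(seed).permutation(n), n_folds)
    best, best_mse = None, np.inf
    for l in l_grid:
        for kappa in kappa_grid:
            mse = 0.0
            for val in folds:
                train = np.setdiff1d(np.arange(n), val)
                model = fit(X[train], Y[train], l, kappa)   # train rLVR
                mse += np.mean((Y[val] - predict(model, X[val])) ** 2)
            if mse < best_mse:
                best, best_mse = (l, kappa), mse
    return best            # the (l, kappa) pair with the lowest validation MSE
```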

    TABLE II Angles Between r for LVR and rLVR (°)

    Fig. 6. MSEs of quality variables with increasing κ.

    B. Case Study on the Tennessee Eastman Process

In this section, the Tennessee Eastman (TE) process [26] is utilized to further demonstrate the effectiveness of the proposed algorithm. The TE process was created by the Eastman Chemical Company for the purpose of developing and evaluating techniques proposed in the process systems engineering field. The process involves five main units: a reactor, a condenser, a stripper, a compressor and a separator. The reactions occurring in the reactor are

$$A(g) + C(g) + D(g) \rightarrow G(liq), \qquad A(g) + C(g) + E(g) \rightarrow H(liq)$$

$$A(g) + E(g) \rightarrow F(liq), \qquad 3D(g) \rightarrow 2F(liq)$$

where the reactants $A$, $C$, $D$ and $E$ are gases, and the main products $G$ and $H$ and the byproduct $F$ are liquids.

Two blocks of data are available in the TE process: the process measurements XMEAS(1–41) and the manipulated variables XMV(1–12). A detailed description of these variables is given in [26]. In this case study, XMEAS(1–22) and XMV(1–11) are selected as the process variables, and XMEAS(35–36) are chosen as the quality variables. Cross validation is applied to choose the model parameters: for rLVR, $l = 1$ and $\kappa = 0.013$; for CCA, $l = 1$, $\kappa_x = 0.1$, and $\kappa_y = 0.001$; and for PLS, $l = 4$. In this case study, the regularized LVR and CCA are employed for the performance comparison to address the collinearity in the TE process data.

Downs and Vogel [26] simulated 20 disturbances for further analysis, and two typical ones, IDV(1) and IDV(4), are selected to compare the performance of rLVR, PLS and CCA in this work.

1) IDV(1) – a Step Disturbance in the A/C Feed Ratio: While keeping the composition of $B$ constant, a step change is introduced in the $A/C$ feed ratio in IDV(1). Figs. 7–9 show the prediction performance of rLVR, PLS and CCA, respectively. It is noted that the quality variables XMEAS(35) and XMEAS(36) are denoted as $Y_1$ and $Y_2$, respectively, in the figures. In PLS, a gap exists consistently for XMEAS(36), even after the process returns to normal under the effect of the controllers. In contrast, both rLVR and CCA predict the variations and trends of the quality variables well, with rLVR slightly better in terms of MSEs, as shown in Table III. It is noted that rLVR cannot fully follow the trend of the quality variables when the disturbance is introduced; this is caused by the dynamics in the process, which will be addressed with a dynamic extension of rLVR in future work.

    Fig. 7. Prediction results of rLVR for IDV(1).

    Fig. 8. Prediction results of PLS for IDV(1).

    Fig. 9. Prediction results of CCA for IDV(1).

    TABLE III MSEs of rLVR, CCA and PLS for IDV(1)

The monitoring performance of rLVR and PLS is shown in Figs. 10 and 11; CCA's results are omitted due to their negligible differences from rLVR on these data. In Figs. 10 and 11, $T^2$, $T_r^2$ and $Q_r$ denote the monitoring indices for the principal component subspace, the process principal subspace, and the process residual subspace, respectively, while the remaining index is the quality monitoring index obtained by performing PCA on the quality variables directly. As observed from the figures, for both rLVR and PLS, $T^2$, $T_r^2$ and $Q_r$ respond more quickly than the quality monitoring index, where the black vertical line denotes the timestamp at which the disturbance is introduced. Aligning with the prediction results, PLS tends to return to normal after the tuning of the controllers, but false alarms are still consistently raised after Sample 200. In contrast, rLVR follows the quality trends better than PLS, and only process-relevant faults are detected in $T_r^2$ and $Q_r$, with a lower level of importance. Therefore, due to the emphasis on quality information in the modeling phase, rLVR-based prediction and monitoring perform better than PLS.

    Fig. 10. Monitoring results of rLVR for IDV(1).

    Fig. 11. Monitoring results of PLS for IDV(1).

2) IDV(4) – a Step Change in the Reactor Cooling Water Inlet Temperature: In IDV(4), due to the correction of the controllers, the quality variables are not affected; their variations and the predictions of rLVR, PLS and CCA are presented in Figs. 12–14. In terms of prediction performance, rLVR, PLS and CCA achieve comparable results, as summarized in Table IV, with PLS performing the worst. As validated by the index variations in Figs. 15 and 16, the disturbance in IDV(4) is quality-irrelevant. However, PLS raises many false alarms for quality-relevant faults with the $T^2$ statistic, which reduces the reliability of the fault detection system. The monitoring results of rLVR in Fig. 15 are more credible, indicating that the disturbance affects the process variables only.

    Fig. 12. Prediction results of rLVR for IDV(4).

    Fig. 13. Prediction results of PLS for IDV(4).

    Fig. 14. Prediction results of CCA for IDV(4).

    TABLE IV MSEs of rLVR, CCA and PLS for IDV(4)

    VII. Conclusions

In this paper, a new regularized latent variable regression (rLVR) method is proposed for multivariate modeling and process monitoring. rLVR aims to maximize the projection of the quality variables on the latent space, which is shown to be equivalent to minimizing the prediction error between the process and quality scores. The geometric properties and model relations of rLVR are derived and summarized, and the relations among rLVR, PLS and CCA are analyzed both theoretically and geometrically. A process monitoring framework based on rLVR is developed to detect anomalies in the principal component subspace, the process principal subspace and the process residual subspace. Two case studies, numerical simulations and the Tennessee Eastman process, are employed to demonstrate the effectiveness of rLVR over PLS and CCA in terms of prediction and monitoring.

    Fig. 15. Monitoring results of rLVR for IDV(4).

    Fig. 16. Monitoring results of PLS for IDV(4).

Appendix A Proof of Lemma 2

With the relation in (8), the objective of LVR in (5) can be rearranged in terms of $\theta$, the angle between $u$ and $t$, whose range is $[0^{\circ}, 180^{\circ}]$. Additionally, since the direction of $t$ or $u$ makes no difference to the minimum value of $J$, the range of $\theta$ can be further restricted to $[0^{\circ}, 90^{\circ}]$. Therefore, the objective is equivalent to maximizing $\|u\|\cos\theta$, the projection of the quality score onto the latent direction. By substituting $t = Xw$ and $u = Yq$, the equivalence between (5) and (13) is proved. ∎

Appendix B Proof of Lemma 3

1) From Algorithm 1 and (16), we have

$$X_{i+1} = X_i - t_ip_i^{T} = X_i\left(I - w_ip_i^{T}\right)$$

where $t_i = X_iw_i$ and $p_i = X_i^{T}t_i/(t_i^{T}t_i)$. Then the first item in Lemma 3 can be proved by

$$X_{i+1}^{T}t_i = \left(I - p_iw_i^{T}\right)X_i^{T}t_i = X_i^{T}t_i - p_i\,t_i^{T}t_i = 0.$$

2) Another way to represent $X_i$ is

$$X_i = X\prod_{j=1}^{i-1}\left(I - w_jp_j^{T}\right) \equiv XN_{1:i-1} \tag{39}$$

where $N_{1:i-1} = \prod_{j=1}^{i-1}(I - w_jp_j^{T})$. Additionally, the second item can be shown equivalently through (39).

It is noted that the last two items in Lemma 3 can be proved in a similar way to the first two items; thus, their proofs are omitted in the paper.

The loading matrix $P$ can be expressed as

$$P = X^{T}T\left(T^{T}T\right)^{-1}.$$

Appendix C Proof of Lemma 4

With $T = XR$ and the expression of $P$ in Appendix B, $R^{T}P$ is proved to be an identity matrix as follows:

$$R^{T}P = R^{T}X^{T}T\left(T^{T}T\right)^{-1} = T^{T}T\left(T^{T}T\right)^{-1} = I.$$

Thus,

$$\left(PR^{T}\right)\left(PR^{T}\right) = P\left(R^{T}P\right)R^{T} = PR^{T}, \qquad \left(I - PR^{T}\right)\left(I - PR^{T}\right) = I - 2PR^{T} + PR^{T} = I - PR^{T}.$$
