
    A Hybrid Deep Learning-Based Unsupervised Anomaly Detection in High Dimensional Data

2022-03-14 09:25:08
Computers, Materials & Continua, March 2022

    Amgad Muneer,Shakirah Mohd Taib,Suliman Mohamed Fati,Abdullateef O.Balogun and Izzatdin Abdul Aziz

    1Department of Computer and Information Sciences,Universiti Teknologi PETRONAS,Seri Iskandar,32160,Malaysia

    2Centre for Research in Data Science(CERDAS),Universiti Teknologi PETRONAS,Seri Iskandar,32610,Perak,Malaysia

    3Information Systems Department,Prince Sultan University,Riyadh,11586,Saudi Arabia

Abstract: Anomaly detection in high-dimensional data is a critical research issue with serious implications for real-world problems. Many issues in this field remain unsolved, so several modern anomaly detection methods struggle to maintain adequate accuracy due to the highly descriptive nature of big data. Such a phenomenon is referred to as the "curse of dimensionality", which affects traditional techniques in terms of both accuracy and performance. Thus, this research proposes a hybrid model based on a Deep Autoencoder Neural Network (DANN) with five layers that reduces the difference between the input and output. The proposed model was applied to a real-world gas turbine (GT) dataset that contains 87620 columns and 56 rows. During the experiments, two issues were investigated and solved to enhance the results. The first is the dataset class imbalance, which was solved using the SMOTE technique. The second is poor model performance, which was addressed with an optimization algorithm. Several optimization algorithms were investigated and tested, including stochastic gradient descent (SGD), RMSprop, Adam and Adamax; the Adamax algorithm showed the best results when employed to train the DANN model. The experimental results show that the proposed model can detect anomalies by efficiently reducing the high dimensionality of the dataset, with an accuracy of 99.40%, an F1-score of 0.9649, an Area Under the Curve (AUC) rate of 0.9649, and a minimal loss function during hybrid model training.

    Keywords: Anomaly detection; outlier detection; unsupervised learning;autoencoder; deep learning; hybrid model

    1 Introduction

Nowadays, a huge amount of data is produced continuously at an unparalleled speed from diverse and composite origins such as social media, sensors, telecommunication, financial transactions, etc. [1,2]. Such acceleration in data generation has given rise to the concept of big data, which can be attributed to the pace and dynamism of technological advancements. For instance, the emergence of the Internet of Things (IoT) and the increase in smart device usage (wearables and non-wearables) have contributed to the upsurge in the continuous generation of data [3]. As defined by Gandomi et al. [4], big data can be described as high-volume, high-velocity, and high-variety datasets from which knowledge or insights can be derived using data analytic tools. Moreover, big data is conceptualized as the 5 Vs (Value, Veracity, Variety, Velocity and Volume) [5]. As shown in Fig. 1, Value depicts the advantage of data analysis; Veracity shows the level of accuracy; and Variety represents the different kinds of data (structured, semi-structured, and unstructured) present in big data [6]. Volume shows the magnitude of data being processed or stored. However, an increase in the volume of data leads to an increase in the dimensionality of such data; dimensionality is the number of features or attributes present in a dataset. Finally, Velocity represents the rate at which data are produced, which may span several dimensions. The preceding statements show how the 5 Vs of big data address its underlying limitations [7]. Nonetheless, the dimensionality of data, which is proportional to the volume of the data, is somewhat overlooked. Large data dimensions can negatively affect the extraction of knowledge from a dataset. That is, high dimensionality can affect data analytics such as anomaly detection in a large dataset.

    Figure 1: High dimensionality problem in big data [5]

Anomaly detection refers to the challenge of detecting patterns in data that do not correspond to anticipated behavior [8]. In various application domains, these non-conforming patterns are referred to as deviations, outliers, discordant observations, variations, aberrations, shocks, peculiarities, or pollutants [9,10]. The existence of anomalies in a dataset can be seen as a data quality problem, as it can lead to undesired outcomes if not removed [11,12]. As such, removing anomalous points from a dataset improves data quality, which makes doing so imperative [13]. Besides, high dimensionality in datasets causes data objects to appear close to one another, which leads to ambiguity in the respective data distances [14]. Although there are several detection techniques that employ sophisticated and efficient computational approaches [8,15], conventional anomaly detection techniques cannot adequately handle the high-dimensionality issue. Besides, many of these conventional techniques assume that the data have uniform attributes or features. On the contrary, real-life datasets in most cases have diverse types of attributes. This observation points to a heightened problem in anomaly detection [5,8,15].

Several anomaly detection techniques have been proposed across different application domains [14-18]. Neighbor-based techniques such as LOF, kNNW, and ODIN detect anomalies using neighborhood information from data points [19-21]. These methods record poor performance and are somewhat sensitive to the parameters of the proposed methods. Another instance is the recent ensemble-based anomaly detection techniques. Zimek et al. [22] and Pasillas-Díaz et al. [23] proposed ensemble-based anomaly detection methods with good performance. However, ensemble methods are black-box mechanisms that lack explainability. Besides, selecting the most applicable and appropriate meta-learner is an open research problem. As another example, Wilkinson [24] proposed an unsupervised algorithm known as HDoutliers that can detect anomalies in high-dimensional datasets. Findings from the comparative study by Talagala et al. [25] also corroborate the efficiency of the HDoutliers algorithm. However, the tendency of HDoutliers to increase the false negative rate is its drawback. Chalapathy et al. [26], Wu et al. [27], and Favarelli et al. [28], in their respective studies, proposed One-Class Neural Network anomaly detection methods for small and large-scale datasets. Also, Malhotra et al. [29], Nguyen et al. [30], Zhou et al. [17], and Said Elsayed et al. [31] developed anomaly detection methods based on long short-term memory (LSTM). However, these existing methods cannot handle the class imbalance problem.

Instigated by the preceding problems, this study proposes a novel hybrid deep learning-based approach for anomaly detection in large-scale datasets. Specifically, a data sampling method and a multi-layer deep autoencoder with the Adamax optimization algorithm are proposed. The Synthetic Minority Over-sampling Technique (SMOTE) is used as the data sampling method to resolve the inherent class imbalance problem by augmenting the number of minority class instances to the level of the majority class label. A novel deep autoencoder neural network (DANN) with the Adamax optimization algorithm is used for detecting anomalies and reducing dimensionality. The primary contributions of this work are summarized as follows:

• A novel DANN approach to detect anomalies in time series in an unsupervised mode.

• Hybridization of SMOTE data sampling and DANN to overcome the inherent class imbalance problem.

• Addressing and overcoming the curse of dimensionality in data by applying a multi-layer autoencoder model that can find optimal parameter values and minimize the difference between the input and the output using the deep reconstruction error during model training.

The rest of this paper is structured as follows. Section 2 highlights the background and related work of the study. Section 3 outlines this work's research methodology, while Section 4 describes the experimental findings. Lastly, Section 5 concludes the paper and highlights future work.

    2 Background and Related Work

Anomaly detection is a well-known issue in a variety of fields, so different approaches have recently been proposed to mitigate it. Further information about this issue can be found in [5,32-35]. In this section, we look at some of the more common anomaly detection techniques and their relevant enhancements.

One of the commonly used anomaly detection techniques is the neighbor-based technique, whereby outliers are identified based on neighborhood information. The anomaly score is computed as the average or weighted distance between the data object and its k nearest neighbors [19,21]. Another option is using the local outlier factor (LOF) to determine the degree of anomaly, whereby the anomaly score is calculated in accordance with the object's neighborhood [36]. Likewise, Hautamaki et al. [20] proposed Outlier Detection using Indegree Number (ODIN), which is based on the kNN graph, whereby data instances are segregated based on their respective influence in their neighborhood. It is worth mentioning that all the above-mentioned neighbor-based detection methods are independent of data distributions and can detect isolated entities. However, their success relies heavily on distance scales, which are unreliable or insignificant in high-dimensional spaces. Considering the ranking of neighbors is a viable solution to overcome this issue, as even with high-dimensional data the ranking of each object's nearest neighbors remains significant. The underlying assumption is that if the same process created two objects, they would most likely become nearest neighbors or have similar neighbors [37].
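As a toy illustration of the distance-based scoring described above, the following stdlib-only sketch assigns each point the average distance to its k nearest neighbors; the 2-D points and the value k = 3 are made-up examples, not data from the cited methods:

```python
import math

def knn_anomaly_score(point, data, k=3):
    """Average Euclidean distance from `point` to its k nearest neighbors."""
    dists = sorted(math.dist(point, other) for other in data if other is not point)
    return sum(dists[:k]) / k

# Toy 2-D dataset: a tight cluster plus one isolated point.
cluster = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (0.05, 0.05)]
outlier = (5.0, 5.0)
points = cluster + [outlier]

scores = {p: knn_anomaly_score(p, points, k=3) for p in points}
# The isolated point receives a far larger score than any cluster member.
```

As the surrounding text notes, in high-dimensional spaces these raw distances concentrate, which is why ranking-based variants are preferred there.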

Another applicable approach is the subspace learning method. Subspace-based anomaly detection approaches try to locate anomalies by sifting through various subsets of dimensions in an orderly manner. According to Zimek et al. [22], only a subset of relevant features for an object in a high-dimensional space provides useful information, while the rest are unrelated to the task. The presence of irrelevant features can make the anomaly detection process challenging. Another direction is the sparse subspace technique, a kind of subspace technique in which points in a high-dimensional space are projected onto one or more low-dimensional subspaces, called sparse subspaces in this case [38,39]. As a result, objects that fall into sparse subspaces are considered anomalies due to their abnormally low densities. It should be noted, however, that such examination of the feature vectors from the whole high-dimensional space takes time [38,40]. Therefore, to improve exploration results, Aggarwal et al. [41] used an evolutionary algorithm, whereby a space projection was described as a subspace with the most negative sparsity coefficients. However, certain factors, such as the initial population, fitness functions, and selection processes, have a substantial effect on the results of the evolutionary algorithm. The disadvantage of this method is its reliance on a large amount of data to identify the variance trend.

Ensemble learning is another feasible anomaly detection approach, which can be attributed to its efficiency over baseline methods [22,42,43]. Specifically, feature bagging and subsampling have been deployed to aggregate anomaly scores and pick the optimal value. For instance, Lazarevic et al. [44] randomly selected feature samples from the initial feature space; an anomaly detection algorithm is then used to estimate the score of each item on each feature subset, and the scores for the same item are added together to form the final score. On the other hand, Nguyen et al. [45] estimated anomaly scores for objects on random subspaces using multiple detection methods rather than the same one. Similarly, Keller et al. [46] suggested a modular anomaly detection approach that splits the anomaly mining mechanism into two parts, subspace search and anomaly ranking. Using the Monte Carlo sampling method, the subspace search aims to obtain high contrast subspaces (HiCS), and the LOF scores of objects are then aggregated on the obtained subspaces. Van Stein et al. [40] took this a step further by accumulating similar HiCS subspaces and then measuring the anomaly scores of entities using local anomaly probabilities in the global feature space. Zimek et al. [22] used the random subsampling method to find each object's closest neighbors and then estimate its local density; this ensemble approach, when used in conjunction with an anomaly detection algorithm, is more efficient and yields a more diverse range of results. There are several approaches for detecting anomalies that consider both attribute bagging and subsampling. Pasillas-Díaz et al. [23], for example, used bagging to collect various features at each iteration and then used subsampling to measure anomaly scores for different subsets of data. With bagging ensembles, however, it is difficult to achieve entity variation, and the results are vulnerable to the scale of the subsampled datasets. It is important to note that most of the above-mentioned anomaly detection methods can only process numerical data, leading to low efficacy. Moreover, most of the preceding studies failed to investigate class imbalance, an inherent problem in machine learning that is present in most datasets. Thus, this study proposes a novel hybrid deep learning-based approach for anomaly detection in large-scale datasets.

    3 Materials and Methods

This section describes the gas turbine (GT) dataset, the real-world data utilized for anomaly detection in a high-dimensional setting. It also discusses the various techniques used for dimensionality reduction and feature optimization, and the different stages of the proposed hybrid model.

    3.1 Dataset Description

The dataset used in this research is real, high-dimensional industrial data from a gas turbine. The data contain 87620 columns and 56 rows. In this study, the data were split into a training set and a testing set with a ratio of 60:40. Detecting anomalies in real-world high-dimensional data is both a theoretical and a practical challenge due to the "curse of dimensionality" issue, which is widely discussed in the literature [47,48]. Therefore, we utilized a deep autoencoder algorithm composed of two symmetrical deep belief networks of four shallow layers: half of the network is responsible for encoding, and the other half for decoding. The autoencoder learns the significant features present in the data by minimizing the reconstruction error between the input and output data. In particular, high-dimensional noisy data are common, so the first step is to reduce the dimension of the data. During this process, the data are projected onto a lower-dimensional space; thus the noise is eliminated and only the essential information is preserved. Accordingly, the Deep Autoencoder Neural Network (DANN) algorithm is used in this paper to reduce the data noise.
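The 60:40 partition can be sketched with a small stdlib-only helper; the record count of 100 and the seed are illustrative assumptions, not values from the paper:

```python
import random

def train_test_split(rows, train_ratio=0.6, seed=42):
    """Shuffle and split records into train/test partitions (60:40 here)."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * train_ratio)
    return rows[:cut], rows[cut:]

records = list(range(100))        # stand-in indices for dataset records
train, test = train_test_split(records)
# 60 records land in `train`, the remaining 40 in `test`.
```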

3.2 The Proposed Deep Autoencoder Neural Network (DANN) Algorithm

An autoencoder is a particular type of artificial neural network utilized primarily for unsupervised machine learning tasks [49-51]. Like the works in [52-56], this study utilizes autoencoders for both dimensionality reduction and anomaly detection. An autoencoder is composed of two components: an encoder and a decoder. The encoder's output is a compressed representation of the input pattern, described in terms of a vector function. First, the autoencoder learns a representation (encoding) of the dataset through network training that ignores the "noise"; the goal of this process is to reduce dimensionality [57,58]. Second, the autoencoder tries to produce, from the reduced encoding, a reconstruction as close as possible to its original input. As depicted in Fig. 2, the input, mapping, and bottleneck layers of the DANN estimate the mapping functions that bring the original data into the principal component space of lower dimension [59], whereas the demapping and output layers estimate the demapping functions that carry the projected data back to the original data space.

    Figure 2: Architecture of deep autoencoder neural network (DANN)

The proposed DANN has the following mathematical model form:

y_m = a(W_1 x + b_1),   t = a(W_2 y_m + b_2),   y_d = a(W_3 t + b_3),   x̂ = W_4 y_d + b_4 (1)

where x denotes the input vector, y_m denotes the mapping layer, t denotes the bottleneck layer, y_d denotes the demapping layer, and x̂ represents the output layer. b and W are the bias vectors and weight matrices, respectively, and a denotes the non-linear activation function. Fig. 2 summarizes the dimensions of the matrices and vectors. The objective of auto-associative neural network training is to determine optimal parameter values (i.e., the optimal values of W and b) that minimize the difference between the input and the output, which can be computed as given in Eq. (2):

E = Σ_i ‖x_i − x̂_i‖² (2)

which is also called the reconstruction error.
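A minimal stdlib-only sketch of the mapping/bottleneck/demapping forward pass and the reconstruction error follows; the layer sizes (4-3-2-3-4), the tanh activation, and the random weights are illustrative assumptions, not the authors' published configuration:

```python
import math, random

def matvec(W, v, b):
    """Affine map W v + b with W as a list of rows."""
    return [sum(w * x for w, x in zip(row, v)) + bi for row, bi in zip(W, b)]

def tanh_vec(v):
    return [math.tanh(x) for x in v]

def forward(x, params):
    """Input -> mapping -> bottleneck -> demapping -> output (x_hat)."""
    (W1, b1), (W2, b2), (W3, b3), (W4, b4) = params
    ym = tanh_vec(matvec(W1, x, b1))      # mapping layer
    t = tanh_vec(matvec(W2, ym, b2))      # bottleneck (reduced dimension)
    yd = tanh_vec(matvec(W3, t, b3))      # demapping layer
    return matvec(W4, yd, b4)             # linear output layer

def reconstruction_error(x, x_hat):
    """Squared error between input and reconstruction."""
    return sum((xi - xh) ** 2 for xi, xh in zip(x, x_hat))

rng = random.Random(0)
def layer(n_out, n_in):
    return ([[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_out)],
            [0.0] * n_out)

# 4 -> 3 -> 2 -> 3 -> 4 architecture (bottleneck of size 2), random weights.
params = [layer(3, 4), layer(2, 3), layer(3, 2), layer(4, 3)]
x = [0.5, -0.2, 0.1, 0.3]
e = reconstruction_error(x, forward(x, params))
```

Training then adjusts W and b to drive this error down across the training set; the sketch only evaluates it for untrained random weights.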

    3.3 Objective Functions for Autoencoder Neural Network Training

Apart from the reconstruction error specified in Eq. (2), other objective functions can be used to train autoencoder neural networks. We describe two alternative objective functions in this section: hierarchical error and the denoising criterion. The authors in [60] proposed the concept of hierarchical error to establish a hierarchy (i.e., relative importance) amongst non-linear principal components, in analogy to principal component analysis (PCA), which utilizes the reconstruction error as the objective function [61]. They demonstrated that maximizing the variance of the principal variables is equivalent to minimizing the residual variance in linear PCA. Accordingly, the hierarchical reconstruction error can be described as:

The authors in [62] suggested the denoising criterion to derive more stable principal components. To employ the denoising criterion, a corrupted input x̃ is produced by adding noise, such as masking noise or Gaussian noise, to the original input x. Subsequently, the autoencoder neural network is trained to retrieve the original input using the corrupted data as input. The denoising criterion has been shown to enable autoencoder neural networks to learn a lower-dimensional manifold and thus capture the more essential patterns in the original data. Fig. 3 summarizes the three objective functions schematically.
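The corruption step can be sketched as follows; the noise level, masking probability, and input values are illustrative assumptions:

```python
import random

def corrupt(x, sigma=0.1, mask_prob=0.0, rng=None):
    """Produce a corrupted input: Gaussian noise plus optional masking to 0."""
    rng = rng or random.Random(0)
    noisy = [xi + rng.gauss(0.0, sigma) for xi in x]
    return [0.0 if rng.random() < mask_prob else xi for xi in noisy]

x = [0.5, -0.2, 0.1]
x_tilde = corrupt(x, sigma=0.05)
# The denoising autoencoder is then trained to map x_tilde back to x.
```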

    Figure 3: Different objective functions are depicted schematically: (a) Reconstruction error; (b)Hierarchical error; and (c) Denoising criterion [61]

Based on the above, we have designed a similar procedure for dimensionality reduction utilizing the DANN model. First, the matrix of the original data is partitioned into two sets containing only normal operating data: one set for training and another for testing the DANN model. Second, the autoencoder neural network is trained on the training dataset. Once trained, the autoencoder neural network computes the principal components and residuals when fed a new data sample. This is followed by determining the T² and Q statistics as follows:

T² = Σ_k t_k²/σ_k²,   Q = ‖x − x̂‖²

where t_k denotes the value of the kth principal component in the latest data sample, and σ_k denotes the standard deviation of the kth principal component as determined from the training dataset. It is worth mentioning that upper control limits are conventionally set by assuming the data comply with a multivariate normal distribution. A different approach was followed in this work: the upper control limits for the two statistics are calculated directly from the given large dataset without assuming any distributional form. For instance, with a hundred samples of normal training data, the next-biggest T² (or Q) value is chosen as the upper control limit to attain a false alarm rate of 0.01.
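The distribution-free control limit can be sketched with a short, hypothetical helper; the empirical-quantile indexing and the toy T² values 1..100 are illustrative assumptions, not the authors' code:

```python
def empirical_ucl(stats, false_alarm_rate=0.01):
    """Upper control limit taken directly from sorted training statistics."""
    ordered = sorted(stats)
    # Index of the (1 - alpha) empirical quantile; no distribution assumed.
    idx = int((1.0 - false_alarm_rate) * len(ordered)) - 1
    return ordered[idx]

t2_train = [float(i) for i in range(1, 101)]   # toy T^2 values 1..100
ucl = empirical_ucl(t2_train, 0.01)            # 99th sorted value -> 99.0
flagged = [v for v in t2_train if v > ucl]     # exactly 1 of 100 exceeds it
```

With 100 normal training samples, exactly one sample exceeds the limit, matching the targeted 0.01 false alarm rate on the training data.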

    3.4 Synthetic Minority Oversampling Technique(SMOTE)

Resampling the data, including undersampling and oversampling, is one of the prominent approaches to relieving the issue of an imbalanced dataset [63]. Oversampling techniques are preferable to undersampling techniques in most circumstances [64]. The Synthetic Minority Oversampling Technique (SMOTE) is a well-known oversampling technique whereby synthetic samples of the minority class are produced. SMOTE helps overcome the overfitting issue caused by random oversampling. The technique concentrates on the feature space, creating new instances by interpolating between positive instances that lie close together [65].
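A minimal sketch of the interpolation idea follows; for brevity this toy version picks a random minority neighbor rather than one of the k nearest as real SMOTE does, and the minority points are made-up values:

```python
import random

def smote_sample(x, neighbor, rng):
    """New synthetic point on the segment between x and one of its neighbors."""
    gap = rng.random()
    return [xi + gap * (ni - xi) for xi, ni in zip(x, neighbor)]

def oversample(minority, n_new, rng=None):
    """Grow the minority class by interpolating between its members."""
    rng = rng or random.Random(0)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbor = rng.choice([m for m in minority if m is not x])
        synthetic.append(smote_sample(x, neighbor, rng))
    return minority + synthetic

minority = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
balanced = oversample(minority, n_new=7)   # now 10 minority samples
```

Because each synthetic point lies on a segment between two existing minority samples, it stays inside the minority region of the feature space rather than duplicating an existing instance.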

    3.5 Adam Optimizer

Adam [66] is an adaptive learning rate optimization algorithm designed specifically for training deep neural networks. First introduced in 2014, it has attracted a vast number of researchers due to its high performance compared to SGD or RMSprop.

The algorithm makes use of adaptive learning rate techniques to determine the learning rate for each parameter individually. The Adam algorithm is extremely efficient when dealing with complex problems involving a large number of variables or records; it is reliable and needs less memory. It is a combination of the 'gradient descent with momentum' and 'RMSprop' methods. The momentum method accelerates the gradient descent algorithm by taking the 'exponentially weighted average' of the gradients into account. In addition, Adam inherits the advantages of Adagrad [67], which performs well in settings with sparse gradients but struggles with the non-convex optimization of neural networks. It also uses the advantages of Root Mean Square Propagation (RMSprop) [68] to address some of Adagrad's shortcomings and to perform well in online settings. Utilizing these averages causes the method to converge to the minimum more quickly.

Hence,

m_t = β·m_{t−1} + (1 − β)·(∂L/∂W_t),   W_{t+1} = W_t − α_t·m_t

where m_t denotes the aggregate of gradients at time t (present), m_{t−1} is the aggregate of gradients at time t−1 (prior), W_t is the weights at time t, W_{t+1} is the weights at time t+1, α_t is the learning rate at time t, ∂L is the derivative of the loss function, ∂W_t is the derivative of the weights at time t, and β is the moving average parameter.

RMSprop is an adaptive learning rate method that attempts to improve upon AdaGrad. Rather than computing the cumulative sum of squared gradients as AdaGrad does, it computes an 'exponential moving average'.

Therefore,

V_t = β·V_{t−1} + (1 − β)·(∂L/∂W_t)²,   W_{t+1} = W_t − (α_t/√(V_t + ε))·(∂L/∂W_t)

where W_t is the weights at time t, W_{t+1} is the weights at time t+1, α_t is the learning rate at time t, ∂L is the loss function derivative, ∂W_t is the derivative of the weights at time t, V_t is the exponential moving average of the squares of past gradients, β is the moving average parameter, and ε is a small positive constant. Thus, the strengths of the RMSprop and AdaGrad techniques are inherited by the Adam optimizer, which builds on them to provide a more optimized gradient descent. Combining the equations used in the two techniques above, we get the final representation of the Adam optimizer as follows:

m_t = β₁·m_{t−1} + (1 − β₁)·(∂L/∂W_t),   v_t = β₂·v_{t−1} + (1 − β₂)·(∂L/∂W_t)²

m̂_t = m_t/(1 − β₁ᵗ),   v̂_t = v_t/(1 − β₂ᵗ),   W_{t+1} = W_t − α·m̂_t/(√v̂_t + ε)

where β₁ and β₂ are the average decay rates of the gradients in the two techniques above, and α is the step size parameter/learning rate (0.01).
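The combined Adam update (momentum plus RMSprop-style scaling with bias correction) can be sketched in a few lines of stdlib Python; the quadratic test function and the step count are illustrative assumptions:

```python
def adam_minimize(grad, w0, alpha=0.01, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=5000):
    """Adam: first/second moment estimates with bias correction."""
    w, m, v = w0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g          # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
        m_hat = m / (1 - beta1 ** t)             # bias correction
        v_hat = v / (1 - beta2 ** t)
        w -= alpha * m_hat / (v_hat ** 0.5 + eps)
    return w

# Minimize L(w) = (w - 3)^2, whose gradient is 2(w - 3).
w_star = adam_minimize(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

On this one-parameter toy problem the iterate settles close to the minimizer w = 3; in the paper's setting the same update is applied per network weight.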

    4 Results and Discussion

This section summarizes the experimental findings and discusses their significance for the different approaches, including DANN with the Adam optimizer, DANN with the SGD optimizer, DANN with the RMSprop optimizer, and DANN with the Adamax optimizer. Tab. 1 shows the experimental results for the proposed DANN model with the different optimizer methods.

    4.1 Deep Autoencoder with Adam Optimizer

As depicted in Tab. 1, the DANN model was first tested without any optimization method, as shown in the column labeled "Deep autoencoder". The achieved accuracy averaged 95.91% over 10 iterations. To improve this result, the Adam optimizer was integrated with the proposed DANN model. As shown in the third column, the model performs better and was able to detect the anomalies in the dataset with an accuracy of 97.36%. Fig. 4a depicts the anomaly detection accuracy of the autoencoder neural network with the Adam optimizer, and Fig. 4b shows the proposed model's loss.

    4.2 Deep Autoencoder with RMSprop Optimizer

Fig. 5 shows the accuracy and loss function of the autoencoder neural network with the RMSprop optimizer for both testing and training. Fig. 5a presents the accuracy of the proposed hybrid model, and Fig. 5b presents the loss function of the proposed hybrid model with the RMSprop optimizer algorithm.

    Table 1: Overall experimental results for the proposed approach with different optimizers method

    Figure 4: Accuracy and loss function results for autoencoder neural network with Adam optimizer: (a) Accuracy of the proposed hybrid model; (b) Loss function of the proposed hybrid model

    4.3 Deep Autoencoder with the Adamax Optimizer

The v_t element in the Adam update rule scales the gradient inversely in proportion to the ℓ2 norm of the past gradients (via the v_{t−1} term) and the current gradient g_t², as presented in Eq. (9). Figs. 6a and 6b show the accuracy and loss function results for the deep autoencoder neural network with the Adamax optimizer method. This approach surpasses the other proposed models, with an accuracy of 99.40% and minimal loss, as shown in Fig. 6b.

    Figure 5: Accuracy and loss function results for deep autoencoder neural network with RMSprop optimizer algorithm: (a) Accuracy of the proposed hybrid model; (b) Loss function of the proposed hybrid model

Figure 6: Accuracy and loss function results for the deep autoencoder neural network with the Adamax optimizer method: (a) Accuracy of the proposed hybrid model; (b) Loss function of the proposed hybrid model

    4.4 Performance Evaluation


Five measurement metrics are utilized to evaluate the performance of our experiment: accuracy, precision, recall rate, F1-score, and receiver operating characteristics (ROC). Accuracy is defined as the proportion of correctly classified samples and has the following formula:

Accuracy = (TP + TN)/(TP + TN + FP + FN)

Precision is defined as the proportion of all samples classified as Category-A that truly belong to Category-A. In general, the higher the precision, the lower the system's False Alarm Rate (FAR).

Precision = TP/(TP + FP)

The recall rate indicates the proportion of all samples that truly belong to Category-A that are ultimately classified as such. The recall rate measures a system's capability to detect anomalies: the greater it is, the more anomalous traffic is correctly observed.

Recall = TP/(TP + FN)

The F1-score combines precision and recall into a single metric that encompasses both properties:

F1 = 2·Precision·Recall/(Precision + Recall) (13)

TP, FP, TN, FN represent True Positive, False Positive, True Negative and False Negative, respectively.
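A small helper computing the four metrics from these counts; the confusion-matrix values below are made-up toy numbers, not results from the paper:

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, precision, recall and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = classification_metrics(tp=90, fp=5, tn=100, fn=5)
# acc = 0.95; prec, rec and f1 are each 90/95 ≈ 0.947 here.
```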

Accuracy is the most widely used metric for models trained on balanced datasets. It indicates the fraction of correctly estimated samples out of the overall number of samples evaluated. Fig. 7 shows the accuracy scores for the proposed anomaly detection models, determined on an independent test set. As depicted in Fig. 7, out of the five proposed models, the DANN-based Adamax optimizer model achieved the best accuracy score of 99.40%, followed by a 90.36% score for the DANN-based Adam optimizer and the DANN-based objective function model. Although accuracy is a popular standard measure, it has drawbacks, mainly when there is a class imbalance in the samples; it is therefore often used along with other measures such as the F1-score or the Matthews correlation coefficient.

The F1-score is frequently employed in circumstances where an optimal integration of precision and recall is necessary. It is the harmonic mean of the precision and recall scores of a model; thus, the F1-score can be defined as given in Eq. (13). Fig. 7 shows the F1 prediction values for the anomaly detection models based on the five DANNs, which confirms the earlier performance validated using the AUC ratings. The DANN-based Adam optimizer model achieved an optimal F1-score of 0.9811, while the DANN-based Adamax optimizer model obtained second place with an F1-score of 0.9649. The DANN and DANN-based SGD optimizer models showed comparable performance, achieving F1-scores of 0.9376 and 0.8823, respectively. The DANN with RMSprop optimizer was not far behind the aforementioned DANNs but earned the last place, with an F1-score of 0.8280.

    Figure 7: Precision, recall, F1-score and AUC achieved by DANN-based anomaly detection models

A receiver operating characteristic (ROC) curve is a method for organizing, visualizing, and selecting classification models based on their performance [69]. It is also a valuable performance evaluation measure: ROC curves are insensitive to changes in class distribution and are especially useful for problems involving skewed class distributions [69]. The ROC curve illuminates, in a sense, the cost-benefit trade-off of the classifier under evaluation. The false positive (FP) rate, defined as the ratio of false positives to the total number of negative samples, measures the fraction of negative examples misclassified as positive. This is considered a cost, since any further action taken on an FP result is wasted effort given the mistakenly positive forecast. The true positive rate, defined as the fraction of correctly predicted positive samples, can be considered a benefit, because correctly predicted positive samples help the classifier resolve the examined problem more effectively.

The AUC values of the five proposed models are presented in the legend of Fig. 7. Fig. 7 clearly shows that the DANN-based Adamax optimizer model outperforms the rest of the methods in detecting anomalies in a high-dimensional real-life dataset, with an AUC value of 0.981. The DANN-based Adam optimizer model obtained the second-best prediction with an AUC value of 0.951. The AUC results validate the earlier evaluation results indicated by the F1-score metric.

When optimizing classification models, cross-entropy is often utilized as a loss function. Cross-entropy is extremely useful as a loss function in binary classification problems that involve predicting a class label from one or more input variables. Our model attempts to estimate the target probability distribution Q as closely as possible. Thus, we can estimate the cross-entropy for an anomaly prediction in high-dimensional data using the cross-entropy calculation given as follows:

• Predicted P(class 0) = 1 − yhat

• Predicted P(class 1) = yhat

This implies that the model explicitly predicts the probability of class 1, while the probability of class 0 is given as one minus the predicted probability. Fig. 8 shows the average cross-entropy across all training data for the DANN-based Adamax optimizer method, where the model has minimal function loss. Hence, this confirms that the proposed model is efficient and effective in predicting anomalies in high-dimensional data.
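As a hedged illustration of this calculation, the average binary cross-entropy can be computed as below; the labels and predicted yhat probabilities are made-up toy values:

```python
import math

def binary_cross_entropy(y_true, y_hat, eps=1e-12):
    """Average cross-entropy between true labels and predicted P(class 1)."""
    total = 0.0
    for y, p in zip(y_true, y_hat):
        p = min(max(p, eps), 1.0 - eps)       # avoid log(0)
        # P(class 1) = yhat, P(class 0) = 1 - yhat, as in the text.
        total += -(y * math.log(p) + (1 - y) * math.log(1.0 - p))
    return total / len(y_true)

loss = binary_cross_entropy([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2])
```

Confident predictions on the correct side of 0.5 drive the average loss toward zero, which is the behavior Fig. 8 reports for the trained model.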

    Figure 8: DANN+Adamax optimizer model for anomaly detection using cross-entropy as a loss function

    5 Comparison with Literature

We were unable to find any prior research contribution evaluated on a high-dimensional industrial gas turbine dataset, so we compared our results with two recently proposed approaches for anomaly detection in high-dimensional datasets [70,71], shown in Tab. 2. The comparison covers only the metrics available, but it essentially shows the reader the promising results of the proposed DANN-based Adamax optimizer during model training. The results show that the proposed method surpasses the two previous methods for detecting anomalies in the high-dimensional dataset.

    As presented in Tab. 2, the proposed detection model obtained better results in detecting anomalies and overcoming the curse of dimensionality without needing any complex and labor-intensive feature extraction. This is possible due to the inherent capability of DANNs to learn task-specific feature representations automatically. Thus, the proposed DANN outperforms both the anomaly detection approach based on an Autoregressive Flow-based (ADAF) model [70] and the hybrid semi-supervised anomaly detection model suggested in [71].

    Table 2: Comparison of the proposed approach with related literature contributions

    6 Conclusion

    This study proposed an efficient and improved deep-autoencoder-based anomaly detection approach for a real industrial gas turbine dataset. The proposed approach aims to improve the accuracy of anomaly detection by reducing the dimensionality of the large gas turbine data. The proposed deep autoencoder neural network (DANN) was integrated and tested with several well-known optimization methods for the autoencoder training process. The proposed DANN approach was able to overcome the curse of dimensionality effectively. It was evaluated with commonly used evaluation measures to validate the DANN models' performance. The DANN-based Adamax optimization method achieved the best performance, with an accuracy of 99.40%, an F1-score of 0.9649, and an AUC of 0.9649. In contrast, the DANN-based SGD optimization method obtained the worst performance for anomaly detection in the high-dimensional dataset.

    Funding Statement:This research/paper was fully supported by Universiti Teknologi PETRONAS,under the Yayasan Universiti Teknologi PETRONAS (YUTP) Fundamental Research Grant Scheme (YUTP-015LC0-123).

    Conflicts of Interest:The authors declare that they have no conflicts of interest to report regarding the present study.
