
    A Hybrid Deep Learning-Based Unsupervised Anomaly Detection in High Dimensional Data

Computers, Materials & Continua, March 2022

Amgad Muneer, Shakirah Mohd Taib, Suliman Mohamed Fati, Abdullateef O. Balogun and Izzatdin Abdul Aziz

1Department of Computer and Information Sciences, Universiti Teknologi PETRONAS, Seri Iskandar, 32160, Malaysia

2Centre for Research in Data Science (CERDAS), Universiti Teknologi PETRONAS, Seri Iskandar, 32610, Perak, Malaysia

3Information Systems Department, Prince Sultan University, Riyadh, 11586, Saudi Arabia

Abstract: Anomaly detection in high-dimensional data is a critical research issue with serious implications for real-world problems. Many issues in this field remain unsolved, and several modern anomaly detection methods struggle to maintain adequate accuracy due to the highly descriptive nature of big data. This phenomenon, referred to as the "curse of dimensionality", affects traditional techniques in terms of both accuracy and performance. Thus, this research proposes a hybrid model based on a Deep Autoencoder Neural Network (DANN) with five layers that reduces the difference between the input and output. The proposed model was applied to a real-world gas turbine (GT) dataset that contains 87620 columns and 56 rows. During the experiments, two issues were investigated and solved to enhance the results. The first is the dataset class imbalance, which was solved using the SMOTE technique. The second is poor performance, which can be addressed with an optimization algorithm. Several optimization algorithms were investigated and tested, including stochastic gradient descent (SGD), RMSprop, Adam, and Adamax. The Adamax optimization algorithm showed the best results when employed to train the DANN model. The experimental results show that the proposed model can detect anomalies by efficiently reducing the high dimensionality of the dataset, with an accuracy of 99.40%, an F1-score of 0.9649, an Area Under the Curve (AUC) rate of 0.9649, and a minimal loss function during hybrid model training.

Keywords: Anomaly detection; outlier detection; unsupervised learning; autoencoder; deep learning; hybrid model

    1 Introduction

Nowadays, a huge amount of data is produced continuously at an unparalleled speed from diverse and composite origins such as social media, sensors, telecommunications, financial transactions, etc. [1,2]. Such acceleration in data generation has given rise to the concept of big data, which can be attributed to the pace and dynamism of technological advancements. For instance, the emergence of the Internet of Things (IoT) and the increased usage of smart devices (wearables and non-wearables) have contributed to the upsurge in the continuous generation of data [3]. As defined by Gandomi et al. [4], big data can be described as high-volume, high-velocity, and high-variety datasets from which knowledge or insights can be derived using data analytic tools. Moreover, big data is conceptualized as the 5 Vs (Value, Veracity, Variety, Velocity and Volume) [5]. As shown in Fig. 1, Value depicts the advantage of data analysis; Veracity shows the level of accuracy, while Variety represents the different kinds of data (structured, semi-structured, and unstructured) present in big data [6]. Volume refers to the magnitude of data being processed or stored. However, an increment in the volume of data leads to an increase in the dimensionality of such data; dimensionality is the number of features or attributes present in a dataset. Velocity, in turn, represents the rate at which data are produced, which may consist of several dimensions. The preceding statements show how the 5 Vs of big data address its underlying limitations [7]. Nonetheless, the dimensionality of data, which is proportional to the volume of the data, is somewhat overlooked. Large data dimensions can negatively affect the extraction of knowledge from a dataset. That is, high dimensionality can hinder data analytics tasks such as anomaly detection in a large dataset.

    Figure 1: High dimensionality problem in big data [5]

Anomaly detection refers to the challenge of detecting trends in data that do not correspond to anticipated behavior [8]. In various application domains, these non-conforming patterns are referred to as deviations, outliers, discordant observations, variations, aberrations, shocks, peculiarities, or pollutants [9,10]. The existence of anomalies in a dataset can be seen as a data quality problem, as it can lead to undesired outcomes if they are not removed [11,12]. As such, removing anomalous points from a dataset improves data quality, which makes it an imperative step [13]. Besides, high dimensionality causes data objects to appear close to one another, which leads to ambiguity in the respective data distances [14]. Although there are several detection techniques that rely on sophisticated and efficient computational approaches [8,15], conventional anomaly detection techniques cannot adequately handle the high-dimensionality issue. Moreover, many of these conventional techniques assume that the data have uniform attributes or features. On the contrary, real-life datasets in most cases have diverse types of attributes. This observation points to a heightened problem in anomaly detection [5,8,15].

Several anomaly detection techniques have been proposed across different application domains [14-18]. Neighbor-based techniques such as LOF, kNNW, and ODIN detect anomalies using neighborhood information from data points [19-21]. These methods record poor performance and are somewhat sensitive to their parameter settings. Another instance is the recent ensemble-based anomaly detection techniques. Zimek et al. [22] and Pasillas-Díaz et al. [23] proposed ensemble-based anomaly detection methods with good performance. However, ensemble methods are black-box mechanisms that lack explainability. Besides, selecting the most applicable and appropriate meta-learner is an open research problem. As another example, Wilkinson [24] proposed an unsupervised algorithm known as HDoutliers that can detect anomalies in high-dimensional datasets. Findings from the comparative study by Talagala et al. [25] also corroborate the efficiency of the HDoutliers algorithm. However, the tendency of HDoutliers to increase the false negative rate is its drawback. Chalapathy et al. [26], Wu et al. [27], and Favarelli et al. [28], in their respective studies, proposed One-Class Neural Network anomaly detection methods for small and large-scale datasets. Also, Malhotra et al. [29], Nguyen et al. [30], Zhou et al. [17], and Said Elsayed et al. [31] developed anomaly detection methods based on long short-term memory (LSTM). However, these existing methods cannot handle the class imbalance problem.

Instigated by the preceding problems, this study proposes a novel hybrid deep learning-based approach for anomaly detection in large-scale datasets. Specifically, a data sampling method and a multi-layer deep autoencoder with the Adamax optimization algorithm are proposed. The Synthetic Minority Over-sampling Technique (SMOTE) is used as a data sampling method to resolve the inherent class imbalance problem by augmenting the number of minority class instances to the level of the majority class label. A novel deep autoencoder neural network (DANN) with the Adamax optimization algorithm is used for detecting anomalies and reducing dimensionality. The primary contributions of this work are summarized as follows:

• A novel DANN approach to detect anomalies in time series in an unsupervised mode.

• Hybridization of SMOTE data sampling and DANN to overcome the inherent class imbalance problem.

• A multilayer autoencoder model that addresses the curse of dimensionality by finding optimal parameter values and minimizing the difference between the input and the output using the deep reconstruction error during model training.

The rest of this paper is structured as follows. Section 2 highlights the background and related works of the study. Section 3 outlines this work's research methodology, while Section 4 describes the experimental findings. Lastly, Section 5 concludes the paper and highlights future work.

    2 Background and Related Work

Anomaly detection is a well-known issue in a variety of fields, and different approaches have recently been proposed to mitigate it. Further information about this issue can be found in [5,32-35]. In this section, we look at some of the more common anomaly detection techniques and their relevant enhancements.

One of the most commonly used anomaly detection techniques is the neighbor-based technique, whereby outliers are identified based on neighborhood information. The anomaly score is computed as the average or weighted distance between the data object and its k nearest neighbors [19,21]. Another option is using the local outlier factor (LOF) to determine the degree of anomaly, whereby the anomaly score is calculated according to the object's neighborhood [36]. Likewise, Hautamaki et al. [20] proposed Outlier Detection using Indegree Number (ODIN), which is based on the kNN graph, whereby data instances are segregated based on their respective influence in their neighborhood. It is worth mentioning that all the above-mentioned neighbor-based detection methods are independent of data distributions and can detect isolated entities. However, their success is heavily reliant on distance measures, which are unreliable or insignificant in high-dimensional spaces. Considering the ranking of neighbors is a viable solution to overcome this issue, as even with high-dimensional data the ranking of each object's nearest neighbors remains meaningful. The underlying assumption is that if the same process created two objects, they would most likely become nearest neighbors or have similar neighbors [37].
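To make the scoring concrete, the following is a minimal sketch, under our own illustrative assumptions (random data, k = 5), of the average-distance-to-k-nearest-neighbors score described above; it is not a specific implementation from [19,21].

```python
# Sketch: score each point by its average distance to its k nearest neighbors.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_anomaly_scores(X, k=5):
    # k + 1 neighbors because each point's nearest neighbor is itself
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dists, _ = nn.kneighbors(X)
    return dists[:, 1:].mean(axis=1)  # drop the self-distance column

X = np.random.rand(200, 10)           # placeholder data
scores = knn_anomaly_scores(X)
print(scores.argsort()[-5:])          # indices of the 5 most anomalous points
```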

Another applicable approach is the subspace learning method. Subspace-based anomaly detection approaches try to locate anomalies by sifting through various subsets of dimensions in an orderly manner. According to Zimek et al. [22], only a subset of relevant features for an object in a high-dimensional space provides useful information, while the rest are unrelated to the task. The presence of irrelevant features can make the anomaly detection process hard to carry out. Another direction is the sparse subspace technique, a kind of subspace technique in which data points in a high-dimensional space are projected onto one or more low-dimensional subspaces, called sparse subspaces in this case [38,39]. As a result, objects that fall into sparse subspaces are considered anomalies due to their abnormally low densities. It should be noted, however, that such examination of feature vectors from the whole high-dimensional space is time-consuming [38,40]. Therefore, to improve exploration results, Aggarwal et al. [41] used an evolutionary algorithm, whereby a space projection was described as a subspace with the most negative sparsity coefficients. However, certain factors, such as the initial population, fitness function, and selection process, have a substantial effect on the results of the evolutionary algorithm. The disadvantage of this method is that it relies on a large amount of data to identify the variance trend.

Ensemble learning is another feasible anomaly detection approach, which can be attributed to its efficiency over baseline methods [22,42,43]. Specifically, feature bagging and subsampling have been deployed to aggregate anomaly scores and pick the optimal value. For instance, Lazarevic et al. [44] randomly selected feature samples from the initial feature space. An anomaly detection algorithm is then used to approximate the score of each item on each feature subset, and the scores for the same item are added together to form the final score. On the other hand, Nguyen et al. [45] estimated anomaly scores for objects on random subspaces using multiple detection methods rather than the same one. Similarly, Keller et al. [46] suggested a modular anomaly detection approach that splits the anomaly mining mechanism into two parts, subspace search and anomaly ranking. Using the Monte Carlo sampling method, the subspace search aims to obtain high contrast subspaces (HiCS), and then the LOF scores of objects are aggregated on the obtained subspaces. Van Stein et al. [40] took this a step further by accumulating similar HiCS subspaces and then measuring the anomaly scores of entities using local anomaly probabilities in the global feature space. Likewise, Zimek et al. [22] used the random subsampling method to find each object's closest neighbors and then estimate its local density. This ensemble approach, when used in conjunction with an anomaly detection algorithm, is more efficient and yields a more diverse set of results. There are several approaches for detecting anomalies that consider both attribute bagging and subsampling. Pasillas-Díaz et al. [23], for example, used bagging to collect various features at each iteration and then used subsampling to measure anomaly scores for different subsets of data. With bagging ensembles, however, it is difficult to achieve diversity, and the results are vulnerable to the size of the subsampled datasets. It is important to note that most of the above-mentioned anomaly detection methods can only process numerical data, leading to low efficacy. Moreover, most of the preceding studies failed to investigate class imbalance, an inherent problem in machine learning that is present in most datasets. Thus, this study proposes a novel hybrid deep learning-based approach for anomaly detection in large-scale datasets.

    3 Materials and Methods

This section describes the gas turbine (GT) dataset, the real-world data utilized for anomaly detection in a high-dimensional setting. It also discusses the various techniques used for dimensionality reduction and feature optimization, and the different stages of the proposed hybrid model.

    3.1 Dataset Description

The dataset used in this research is real, high-dimensional industry data for a gas turbine. The data contain 87620 columns and 56 rows. In this study, the data were split into a training set and a testing set with a ratio of 60:40. Detecting anomalies in real-world high-dimensional data is a theoretical and practical challenge due to the "curse of dimensionality" issue, which is widely discussed in the literature [47,48]. Therefore, we utilized a deep autoencoder algorithm composed of two symmetrical deep belief networks comprising four shallow layers: half of the network is responsible for encoding, and the other half is responsible for decoding. The autoencoder learns the significant features present in the data by minimizing the reconstruction error between the input and output data. High-dimensional noisy data are common, so the first step is to reduce the dimension of the data. During this process, the data are projected onto a space of lower dimension; thus, the noise is eliminated and only the essential information is preserved. Accordingly, the Deep Autoencoder Neural Network (DANN) algorithm is used in this paper to reduce the data noise.
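As a rough illustration of this preprocessing step, the sketch below performs a 60:40 split with scikit-learn; the file name gas_turbine.csv and the label column are hypothetical placeholders, not the authors' actual data layout.

```python
# Sketch: 60% training / 40% testing split, as stated above.
import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("gas_turbine.csv")        # hypothetical file path
X = df.drop(columns=["label"]).values      # sensor readings (assumed column name)
y = df["label"].values                     # 0 = normal, 1 = anomaly (assumed)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.4, random_state=42, stratify=y
)
```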

3.2 The Proposed Deep Autoencoder Neural Network (DANN) Algorithm

An autoencoder is a particular type of artificial neural network utilized primarily for unsupervised machine learning tasks [49-51]. Like the works in [52-56], this study utilizes autoencoders for both dimensionality reduction and anomaly detection. An autoencoder is composed of two components: an encoder and a decoder. The encoder's output is a compressed representation of the input pattern, described in terms of a vector function. First, the autoencoder learns the data representation (encoding) of the dataset through network training so as to ignore the "noise"; the goal of this process is dimensionality reduction [57,58]. Second, the autoencoder tries to reproduce, from the reduced encoding, an output that is as close as possible to its original input. As depicted in Fig. 2, the input, mapping, and bottleneck layers of the DANN estimate the mapping functions that bring the original data into the principal component space of lower dimension [59], whereas the demapping and output layers estimate the demapping functions that carry the projected data back to the original data space.

    Figure 2: Architecture of deep autoencoder neural network (DANN)

The proposed DANN has the following mathematical model form:

$$y_m = a(W_1 x + b_1), \quad t = a(W_2 y_m + b_2), \quad y_d = a(W_3 t + b_3), \quad \hat{x} = a(W_4 y_d + b_4) \tag{1}$$

where $x$ denotes the input vector, $y_m$ the mapping layer, $t$ the bottleneck layer, $y_d$ the demapping layer, and $\hat{x}$ the output layer. $b$ and $W$ are the bias vectors and weight matrices, respectively, and $a$ denotes the non-linear activation function. Fig. 2 summarizes the dimensions of the matrices and vectors. The objective of auto-associative neural network training is to determine the optimal parameter values (i.e., the optimal values of $W$ and $b$) that minimize the difference between the input and the output, computed as given in Eq. (2):

$$E = \|x - \hat{x}\|^2 \tag{2}$$

which is also called the reconstruction error.
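The following is a minimal Keras sketch of the five-layer architecture of Fig. 2 (input, mapping, bottleneck, demapping, output) trained on the reconstruction error of Eq. (2); the layer widths (64 and 8) and other hyperparameters are illustrative assumptions, not the paper's reported settings.

```python
# Sketch: five-layer deep autoencoder trained to minimize ||x - x_hat||^2.
from tensorflow.keras import layers, models

input_dim = X_train.shape[1]              # from the split sketched earlier

inputs = layers.Input(shape=(input_dim,))                     # input layer (x)
y_m = layers.Dense(64, activation="relu")(inputs)             # mapping layer (y_m)
t = layers.Dense(8, activation="relu")(y_m)                   # bottleneck layer (t)
y_d = layers.Dense(64, activation="relu")(t)                  # demapping layer (y_d)
outputs = layers.Dense(input_dim, activation="linear")(y_d)   # output layer (x_hat)

dann = models.Model(inputs, outputs)
dann.compile(optimizer="adamax", loss="mse")  # MSE = reconstruction error of Eq. (2)
dann.fit(X_train, X_train, epochs=50, batch_size=32,
         validation_data=(X_test, X_test))
```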

    3.3 Objective Functions for Autoencoder Neural Network Training

Apart from the reconstruction error specified in Eq. (2), other objective functions can be used to train autoencoder neural networks. We describe two alternatives in this section: the hierarchical error and the denoising criterion. The authors in [60] proposed the concept of hierarchical error to establish a hierarchy (i.e., relative importance) amongst the non-linear principal components, analogous to non-linear principal component analysis (PCA), which utilizes the reconstruction error as the objective function [61]. They demonstrated that, in linear PCA, maximizing the variance of the principal components is equivalent to minimizing the residual variance. Accordingly, the hierarchical error can be described as a reconstruction error that enforces this ordering among the extracted components.

The authors in [62] suggested the denoising criterion to derive more stable principal components. To employ the denoising criterion, a corrupted input $\tilde{x}$ is produced by adding noise, such as masking noise or Gaussian noise, to the original input $x$. Subsequently, the autoencoder neural network is trained to retrieve the original input using the corrupted data as input. The denoising criterion has been shown to enable autoencoder neural networks to learn a lower-dimensional manifold and thereby capture more essential patterns in the original data. Fig. 3 summarizes the three objective functions schematically.
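A brief sketch of the denoising criterion, reusing the dann model from the earlier sketch: the input is corrupted with Gaussian noise while the clean original remains the reconstruction target. The noise level of 0.1 is an assumption.

```python
# Sketch: denoising criterion - corrupted input, clean reconstruction target.
import numpy as np

noise_std = 0.1                                            # assumed noise level
X_noisy = X_train + np.random.normal(0.0, noise_std, X_train.shape)

# Note the asymmetry: the network sees X_noisy but must reconstruct X_train.
dann.fit(X_noisy, X_train, epochs=50, batch_size=32)
```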

    Figure 3: Different objective functions are depicted schematically: (a) Reconstruction error; (b)Hierarchical error; and (c) Denoising criterion [61]

Based on the above, we designed a similar procedure for dimensionality reduction utilizing the DANN model. First, the matrix of the original data, which contains only normal operating data, is partitioned into two sets: one for training and another for testing the DANN model. Second, the autoencoder neural network is trained on the training dataset. Once trained, the autoencoder neural network computes the principal components and residuals when fed a new data sample. This is followed by determining the $T^2$ and $Q$ statistics as follows:

$$T^2 = \sum_{k} \left(\frac{t_k}{\sigma_k}\right)^2, \qquad Q = \|x - \hat{x}\|^2$$

where $t_k$ denotes the value of the $k$-th principal component in the latest data sample, and $\sigma_k$ denotes the standard deviation of the $k$-th principal component as determined from the training dataset. It is worth mentioning that upper control limits are conventionally set by assuming that the data comply with a multivariate normal distribution. A different approach was followed in this work: the upper control limits for the two statistics were calculated directly from the given large dataset without assuming any particular distribution form. For instance, with a hundred samples of normal training data, the next biggest $T^2$ (or $Q$) value is chosen as the upper control limit to attain a false alarm rate of 0.01.
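The distribution-free control limit can be sketched as below; the bottleneck scores are stubbed with random placeholders, and the 0.99 empirical quantile mirrors the 0.01 false alarm rate mentioned above.

```python
# Sketch: empirical (distribution-free) upper control limit for T^2.
import numpy as np

def t2_statistic(scores, sigma):
    # scores: bottleneck component values; sigma: per-component std from training
    return np.sum((scores / sigma) ** 2, axis=1)

train_scores = np.random.randn(100, 8)   # placeholder for real bottleneck outputs
sigma = train_scores.std(axis=0)
t2_train = t2_statistic(train_scores, sigma)

# With 100 normal samples, the largest T^2 value (~0.99 quantile) gives a
# false alarm rate of about 0.01, as described in the text.
ucl = np.quantile(t2_train, 0.99)
print("Upper control limit:", ucl)
```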

3.4 Synthetic Minority Oversampling Technique (SMOTE)

Resampling the data, including undersampling and oversampling, is one of the prominent approaches to alleviate the issue of imbalanced datasets [63]. Oversampling techniques are preferable over undersampling techniques in most circumstances [64]. The Synthetic Minority Oversampling Technique (SMOTE) is a well-known oversampling technique whereby synthetic samples are produced for the minority class. SMOTE aids in overcoming the overfitting issue caused by random oversampling. The technique concentrates on the feature space to create new instances by interpolating among positive instances that lie close together [65].
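A minimal sketch of this resampling step with the imbalanced-learn library (our choice of tooling, not one stated in the paper), applied to the training split from the earlier sketch:

```python
# Sketch: SMOTE oversampling of the minority (anomaly) class.
from imblearn.over_sampling import SMOTE

smote = SMOTE(random_state=42)
X_balanced, y_balanced = smote.fit_resample(X_train, y_train)
# Synthetic minority samples are interpolated between close positive
# instances until the class counts match.
```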

    3.5 Adam Optimizer

Adam [66] is an adaptive learning rate optimization algorithm designed specifically for training deep neural networks. It was first introduced in 2014 and has attracted considerable interest from researchers due to its high performance compared to SGD and RMSprop.

The algorithm makes use of adaptive learning rate techniques to determine the learning rate for each parameter individually. The Adam algorithm is extremely efficient when dealing with complex problems involving a large number of variables or records; it is reliable and needs less memory. It is a combination of the 'gradient descent with momentum' and 'RMSprop' methods. The momentum method accelerates the gradient descent algorithm by taking the 'exponentially weighted average' of the gradients into account. In addition, Adam utilizes the advantages of Adagrad [67] to perform well in environments with sparse gradients, where Adagrad alone struggles with the non-convex optimization of neural networks. It also uses the advantages of Root Mean Square Propagation (RMSprop) [68] to address some of Adagrad's shortcomings and to perform well in online settings. Utilizing these averages causes the method to converge to the minimum more quickly.

Hence,

$$m_t = \beta m_{t-1} + (1-\beta)\,\frac{\partial L}{\partial W_t}, \qquad W_{t+1} = W_t - \alpha_t\, m_t$$

where $m_t$ denotes the aggregate of gradients at time $t$ (present), $m_{t-1}$ the aggregate of gradients at time $t-1$ (prior), $W_t$ the weights at time $t$, $W_{t+1}$ the weights at time $t+1$, $\alpha_t$ the learning rate at time $t$, $\partial L$ the derivative of the loss function, $\partial W_t$ the derivative of the weights at time $t$, and $\beta$ the moving average parameter.

RMSprop is an adaptive learning method that attempts to improve upon AdaGrad. Rather than computing the cumulative sum of squared gradients as AdaGrad does, it computes an 'exponential moving average'.

Therefore,

$$V_t = \beta V_{t-1} + (1-\beta)\left(\frac{\partial L}{\partial W_t}\right)^2, \qquad W_{t+1} = W_t - \frac{\alpha_t}{\sqrt{V_t + \epsilon}}\,\frac{\partial L}{\partial W_t}$$

where $W_t$ is the weights at time $t$, $W_{t+1}$ the weights at time $t+1$, $\alpha_t$ the learning rate at time $t$, $\partial L$ the derivative of the loss function, $\partial W_t$ the derivative of the weights at time $t$, $V_t$ the exponential moving average of the squares of past gradients, $\beta$ the moving average parameter, and $\epsilon$ a small positive constant. Thus, the strengths of the RMSprop and AdaGrad techniques are inherited by the Adam optimizer, which builds on them to provide a more optimized gradient descent. Combining the equations used in the aforementioned two techniques, we get the final representation of the Adam optimizer as follows:

$$m_t = \beta_1 m_{t-1} + (1-\beta_1)\,g_t, \qquad v_t = \beta_2 v_{t-1} + (1-\beta_2)\,g_t^2$$

$$\hat{m}_t = \frac{m_t}{1-\beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1-\beta_2^t}, \qquad W_{t+1} = W_t - \frac{\alpha\,\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}$$

where $g_t$ is the gradient at time $t$, $\beta_1$ and $\beta_2$ are the decay rates of the gradient averages in the aforementioned two techniques, and $\alpha$ is the step size parameter/learning rate (0.01).
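Putting the two moment estimates together, the following from-scratch NumPy sketch implements the Adam update with bias correction; the hyperparameters are common defaults plus the paper's stated learning rate of 0.01, and the toy objective f(w) = w² is our own illustration.

```python
# Sketch: one Adam step - first and second moments with bias correction.
import numpy as np

def adam_step(w, grad, m, v, t, alpha=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad          # momentum-style first moment
    v = beta2 * v + (1 - beta2) * grad ** 2     # RMSprop-style second moment
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: minimize f(w) = w^2, whose gradient is 2w.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
print(w)  # approaches the minimum at 0, up to small oscillations
```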

    4 Results and Discussion

This section summarizes the experimental findings and discusses their significance for the different approaches, including DANN with the Adam optimizer, DANN with the SGD optimizer, DANN with the RMSprop optimizer, and DANN with the Adamax optimizer. Tab. 1 shows the experimental results for the proposed DANN model with the different optimizer methods.

    4.1 Deep Autoencoder with Adam Optimizer

As depicted in Tab. 1, the DANN model was first tested independently without any optimization method, as shown in the column labeled "Deep autoencoder". The achieved result is 95.91% on average over 10 iterations. To improve this result, the Adam optimizer method was integrated with the proposed DANN model. As shown in the third column, the model performs better and was able to detect the anomalies in the dataset with an accuracy of 97.36%. Fig. 4a depicts the anomaly detection accuracy of the autoencoder neural network with the Adam optimizer, and Fig. 4b shows the proposed model's loss.

    4.2 Deep Autoencoder with RMSprop Optimizer

Fig. 5 shows the accuracy and loss function of the autoencoder neural network with the RMSprop optimizer for both testing and training. Fig. 5a presents the accuracy of the proposed hybrid model, and Fig. 5b presents the loss function of the proposed hybrid model with the RMSprop optimizer algorithm.

Table 1: Overall experimental results for the proposed approach with different optimizer methods

    Figure 4: Accuracy and loss function results for autoencoder neural network with Adam optimizer: (a) Accuracy of the proposed hybrid model; (b) Loss function of the proposed hybrid model

    4.3 Deep Autoencoder with the Adamax Optimizer

The $v_t$ element in the Adam update rule scales the gradient inversely proportionally to the $\ell_2$ norm of the past gradients (via the $v_{t-1}$ term) and the current gradient $g_t^2$, as presented in Eq. (9). Figs. 6a and 6b show the accuracy and loss function results for the deep autoencoder neural network with the Adamax optimizer method. This approach surpasses the other proposed models, with an accuracy of 99.40% and minimal loss, as shown in Fig. 6b.
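Adamax generalizes this ℓ2-norm scaling to the ℓ∞ norm, replacing $v_t$ with a running maximum $u_t$. The sketch below, on the same toy objective and with assumed default hyperparameters, illustrates that update:

```python
# Sketch: one Adamax step - the second moment becomes an infinity-norm
# running maximum, so no bias correction is needed for u.
import numpy as np

def adamax_step(w, grad, m, u, t, alpha=0.01, beta1=0.9, beta2=0.999):
    m = beta1 * m + (1 - beta1) * grad
    u = max(beta2 * u, abs(grad))               # running max of past gradient magnitudes
    w = w - (alpha / (1 - beta1 ** t)) * m / u
    return w, m, u

# Toy usage: minimize f(w) = w^2, gradient 2w.
w, m, u = 5.0, 0.0, 1e-8                        # small initial u avoids division by zero
for t in range(1, 2001):
    w, m, u = adamax_step(w, 2 * w, m, u, t)
print(w)  # converges toward the minimum at 0
```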

    Figure 5: Accuracy and loss function results for deep autoencoder neural network with RMSprop optimizer algorithm: (a) Accuracy of the proposed hybrid model; (b) Loss function of the proposed hybrid model

Figure 6: Accuracy and loss function results for deep autoencoder neural network with the Adamax optimizer method: (a) Accuracy of the proposed hybrid model; (b) Loss function of the proposed hybrid model

    4.4 Performance Evaluation


Five measurement metrics are utilized to evaluate the performance of our experiment: Accuracy, Precision, Recall rate, F1-score, and receiver operating characteristics (ROC). Accuracy is defined as the proportion of correctly classified samples and has the following formula:

$$\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}$$

Precision is defined as the proportion of samples classified as Category-A that truly belong to Category-A. In general, the higher the Precision, the lower the system's False Alarm Rate (FAR).

$$\text{Precision} = \frac{TP}{TP + FP}$$

The recall rate indicates the proportion of all samples that truly belong to Category-A which are classified as such. The recall rate measures a system's capability to detect anomalies: the greater it is, the more anomalous traffic is correctly observed.

$$\text{Recall} = \frac{TP}{TP + FN}$$

The F1-score combines precision and recall into a single metric that encompasses both properties:

$$F1 = 2 \times \frac{\text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}} \tag{13}$$

TP, FP, TN, and FN represent True Positive, False Positive, True Negative, and False Negative, respectively.
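For reference, these measures can be computed with scikit-learn as sketched below; the label vectors are small placeholders rather than the paper's actual predictions.

```python
# Sketch: computing the five evaluation measures with scikit-learn.
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)

y_true = [0, 1, 0, 1, 1]   # placeholder ground truth
y_pred = [0, 1, 0, 0, 1]   # placeholder predictions

print("Accuracy :", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("F1-score :", f1_score(y_true, y_pred))
print("AUC      :", roc_auc_score(y_true, y_pred))  # normally fed probabilities
```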

Accuracy is the most widely used metric for models trained on balanced datasets. It indicates the fraction of correctly estimated samples out of the overall number of samples under evaluation. Fig. 7 shows the accuracy scores for the proposed anomaly detection models, determined from an independent test set. As depicted in Fig. 7, out of the five proposed models, the DANN-based Adamax optimizer model achieved an accuracy score of 99.40%, followed by a 90.36% score for the DANN-based Adam optimizer and the DANN-based objective function model. Although accuracy is a popular standard measure, it has drawbacks, mainly when there is a class imbalance in the samples; it is therefore often used along with other measures such as the F1-score or the Matthews correlation coefficient.

The F1-score is frequently employed in circumstances where an optimal integration of precision and recall is necessary. It is the harmonic mean of a model's precision and recall scores; thus, the F1-score can be defined as given in Eq. (13). Fig. 7 shows the F1 prediction values for the five DANN-based anomaly detection models, which confirms the performance validated using the AUC ratings. The DANN-based Adam optimizer model achieved an optimal F1-score of 0.9811, while the DANN-based Adamax optimizer model obtained second place with an F1-score of 0.9649. The DANN and DANN-based SGD optimizer models showed comparable performance and achieved F1-scores of 0.9376 and 0.8823, respectively. The DANN with RMSprop optimizer was not far behind the aforementioned DANNs but placed last, with an F1-score of 0.8280.

    Figure 7: Precision, recall, F1-score and AUC achieved by DANN-based anomaly detection models

A receiver operating characteristics (ROC) curve is a method for organizing, visualizing, and selecting classification models based on their performance [69]. Additionally, it is a valuable performance evaluation measure: ROC curves are insensitive to changes in class distribution and are especially useful for problems involving skewed class distributions [69]. The ROC curve illuminates, in a sense, the cost-benefit trade-off of the classifier under evaluation. The ratio of false positives (FP) to the total negative samples is defined as the false positive rate and measures the fraction of negative examples misclassified as positive. This is considered a cost, since any further action taken on a false positive prediction is wasted effort. The true positive rate, defined as the fraction of correctly predicted positive samples, can be considered a benefit, because correctly predicted positive samples help the classifier resolve the examined problem more effectively.

The AUC values of the five proposed models in this analysis are presented in the legend portion of Fig. 7. It is clearly shown in Fig. 7 that the DANN-based Adamax optimizer model outperforms the rest of the methods in detecting anomalies in a high-dimensional real-life dataset, with an AUC value of 0.981. The DANN-based Adam optimizer model obtained the second-best prediction, with an AUC value of 0.951. The AUC results validate the earlier evaluation results indicated by the F1-score metric.

When optimizing classification models, cross-entropy is often utilized as a loss function. Cross-entropy as a loss function is extremely useful in binary classification problems that involve predicting a class label from one or more input variables. Our model attempts to estimate the target probability distribution Q as closely as possible. Thus, we can estimate the cross-entropy for an anomaly prediction in high-dimensional data using the binary cross-entropy calculation

$$H(P, Q) = -\left[\,y \log(\hat{y}) + (1-y)\log(1-\hat{y})\,\right]$$

where $y$ is the true label and $\hat{y}$ (yhat) is the predicted probability, given as follows:

• Predicted P(class 0) = 1 − yhat

• Predicted P(class 1) = yhat

This implies that the model explicitly predicts the probability of class 1, while the probability of class 0 is given as one minus the predicted probability. Fig. 8 shows the average cross-entropy across all training data for the DANN-based Adamax optimizer method, where the model attains a minimal loss. This confirms that the proposed model is efficient and effective in predicting anomalies in high-dimensional data.
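A minimal NumPy sketch of this average binary cross-entropy, with placeholder labels and predicted probabilities:

```python
# Sketch: average binary cross-entropy over a batch of predictions.
import numpy as np

def binary_cross_entropy(y_true, yhat, eps=1e-12):
    yhat = np.clip(yhat, eps, 1 - eps)        # avoid log(0)
    return -np.mean(y_true * np.log(yhat) + (1 - y_true) * np.log(1 - yhat))

y_true = np.array([0, 0, 1, 1])               # placeholder labels
yhat = np.array([0.05, 0.10, 0.90, 0.95])     # placeholder predicted P(class 1)
print(binary_cross_entropy(y_true, yhat))     # small loss for good predictions
```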

    Figure 8: DANN+Adamax optimizer model for anomaly detection using cross-entropy as a loss function

    5 Comparison with Literature

For anomaly detection in this high-dimensional industrial gas turbine dataset, we were unable to find any prior research contribution evaluated on the same data, so we compared our results with two recently proposed approaches for anomaly detection in high-dimensional datasets [70,71], as shown in Tab. 2. The comparison is only shown for the metrics available, but it essentially shows the reader the promising results of the proposed DANN-based Adamax optimizer during the training process of the proposed model. The results show that the proposed method surpasses the two previous methods for detecting anomalies in high-dimensional data.

As presented in Tab. 2, the proposed detection model obtained a better result in detecting anomalies and overcoming the curse of dimensionality without needing any complex and labor-intensive feature extraction. This is possible due to the inherent capability of DANNs to learn task-specific feature representations automatically. Thus, the proposed DANN outperforms both the anomaly detection approach based on an Autoregressive Flow-based (ADAF) model [70] and the hybrid semi-supervised anomaly detection model suggested by [71].

    Table 2: Comparison of the proposed approach with related literature contributions

    6 Conclusion

This study proposed an efficient and improved deep autoencoder-based anomaly detection approach for a real industrial gas turbine dataset. The proposed approach aims to improve the accuracy of anomaly detection by reducing the dimensionality of the large gas turbine data. The proposed deep autoencoder neural network (DANN) was integrated and tested with several well-known optimization methods for the deep autoencoder training process. The proposed DANN approach was able to overcome the curse of dimensionality effectively. It was evaluated using commonly used evaluation measures to validate the DANN models' performance. The DANN-based Adamax optimization method achieved the best performance, with an accuracy of 99.40%, an F1-score of 0.9649, and an AUC rate of 0.9649. In contrast, the DANN-based SGD optimization method obtained the worst anomaly detection performance on the high-dimensional dataset.

Funding Statement: This research/paper was fully supported by Universiti Teknologi PETRONAS, under the Yayasan Universiti Teknologi PETRONAS (YUTP) Fundamental Research Grant Scheme (YUTP-015LC0-123).

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
