
    An Improved DeepNN with Feature Ranking for Covid-19 Detection

    Computers, Materials & Continua, 2022, Issue 5

    Noha E. El-Attar, Sahar F. Sabbeh, Heba Fasihuddin and Wael A. Awad

    1 Faculty of Computers and Artificial Intelligence, Benha University, Benha, 13518, Egypt

    2 University of Jeddah, College of Computer Science and Engineering, Jeddah, 21493, Kingdom of Saudi Arabia

    3 Faculty of Computers and Information, Damietta University, Damietta, 34711, Egypt

    Abstract: The outbreak of COVID-19 has taken the lives of many patients so far. The symptoms of COVID-19 include muscle pain, loss of taste and smell, cough, fever, and sore throat, and severe cases can progress to breathing difficulties, organ failure, and death. Thus, early detection of the virus is crucial. COVID-19 can be detected using clinical tests, which makes it important to know which symptoms/features contribute most to the decision process. In this work, we propose a modified multilayer perceptron with feature selection (MLPFS) to predict positive COVID-19 cases based on symptoms and features from patients' electronic medical records (EMR). The MLPFS model includes a layer that identifies the most informative symptoms, minimizing the number of symptoms based on their relative importance. Training the model with only the most informative symptoms speeds up learning and increases accuracy. Experiments were conducted using three different COVID-19 datasets and eight different models, including the proposed MLPFS. Results show that MLPFS achieves the best feature reduction across all datasets compared to all other experimented models. It also outperforms the other models in classification results as well as time.

    Keywords: COVID-19; feature selection; deep learning

    1 Introduction

    The COVID-19 virus is still spreading rapidly. Although some vaccines have been developed to provide acquired immunity against COVID-19, no vaccine completely prevents coronavirus infection in humans. Therefore, early diagnosis of COVID-19 patients is crucial for disease diagnosis and control. As a result, it is essential to classify and analyze COVID-19 data, particularly in epidemic areas, to save medical experts' time and effort. Data mining is considered an effective technique for detecting and predicting several medical issues due to its ability to find and extract meaningful information and patterns from medical datasets. There are several types of data mining-based classification algorithms, including artificial neural networks (ANN), support vector machines (SVM), K-nearest neighbors (KNN), and random forest (RF) [1]. Even though classification models rely primarily on extracting features that characterize data instances, these extracted features may contain many unnecessary or redundant features. From this standpoint, selecting the most important and relevant features, especially for high-dimensional data, enhances classification accuracy and minimizes the time cost.

    In general, traditional feature selection approaches are classified into three types: the filter approach, the wrapper approach, and the embedded approach, as shown in Fig. 1 [2]. Filter methods are classifier-independent; they select feature subsets based on specific given criteria [3]. Correlation-based feature selection, the fast correlation-based filter, and Relief are examples of filter methods [4]. The main drawback of this methodology is that it ignores feature dependencies and relationships with the classifier, which can result in a misclassified model [5]. On the other hand, wrapper and embedded techniques are classifier-dependent. These methods take feature interaction into consideration, resulting in increased classification accuracy. Wrapper approaches, such as forward selection and backward elimination, assess a candidate subset based on the accuracy rate of a given classifier.

    Figure 1: Feature selection approaches

    In contrast, embedded methods (e.g., LASSO and ridge regression) incorporate feature selection inside the classification process. Although wrapper and embedded methods usually attain a high level of classification accuracy, they take longer to execute than filter approaches [3]. Therefore, many researchers have developed hybrid approaches to select the most critical features [5].
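    As a concrete illustration of the filter approach, the sketch below scores each feature by its absolute Pearson correlation with the class label, independently of any classifier. The data and the choice of k are illustrative, not taken from the paper:

```python
import numpy as np

def correlation_filter(X, y, k):
    """Filter-style feature selection: rank features by absolute Pearson
    correlation with the label and keep the indices of the top-k."""
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    return np.sort(np.argsort(scores)[::-1][:k]), scores

# Toy data: feature 0 tracks the label, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200).astype(float)
X = np.column_stack([y + 0.1 * rng.normal(size=200),
                     rng.normal(size=200)])
selected, scores = correlation_filter(X, y, k=1)
print(selected)  # -> [0]
```

    Because the score ignores how features interact inside a classifier, two individually weak but jointly informative features would both be discarded, which is exactly the drawback of filter methods noted above.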

    Machine learning models have been widely leveraged in big data to establish efficient predictive models. However, a critical issue with ML techniques is the high dimensionality of the dataset. For instance, the feature subset size is sometimes much larger than the pattern size, which may decrease the classifier's performance. Therefore, determining feature importance for high-dimensional variables and data has received attention in recent years as a way to improve the accuracy of classification models [6].

    One of the recent trends to enhance classification accuracy is using deep learning networks, by creating deeper and more sophisticated networks such as convolutional neural networks, recurrent neural networks, radial basis function neural networks, and multilayer perceptrons [7]. Deep neural networks (DNN) have generally been viewed as black-box techniques due to their complicated construction. Despite their high predictive power in many learning problems (e.g., system identification, text classification, pattern recognition, and medical diagnosis), they do not reveal which features significantly impact prediction accuracy. In addition, the trainable parameters needed by these networks in both the training and testing stages have become increasingly numerous, necessitating a large amount of memory and processing resources [7]. Feature selection could be adopted and integrated during the learning process to eliminate non-significant features, based on a fitness function value against a predefined threshold, and help identify the inputs that control the outcome [8,9]. Commonly, feature selection (FS) algorithms are used as preprocessors before learning to detect the most relevant features and neglect the non-significant ones. After that, the machine learning algorithm gives the same importance level to all extracted features and applies the induction algorithm [10]. This technique may reduce classification performance because less relevant characteristics may still play a significant role in the learning problem. As a result, a ranking based on the importance of the features for the learning problem may be more valuable [11].

    In recent years, different strategies have been presented to perform the feature importance ranking (FIR) process for DNNs, such as regularization, greedy search, and the averaged input gradient [11]. Regularization strategies are based on shrinking the first hidden layer's weights, which helps reduce the number of parameters in a DNN and yields heuristic approaches for feature selection. Lasso and random forest are examples of regularization strategies [8]. The greedy search strategy effectively optimizes problems to find a global optimum within an acceptable response time. However, this strategy ordinarily incurs high computational costs and may produce solutions far from the optimal ones. The averaged input gradient is a population-wise FIR method that uses the average of all saliency maps derived from individual instances across the global population [11].

    The contribution of this paper is to modify the classification model to learn faster according to the weight of each feature's importance. Rather than dealing with all features identically, the model handles the features according to their importance and utilizes this importance knowledge in the learning process, leading to better classification accuracy. This study proposes a modified MLP network with an added hidden layer that plays the role of the feature selection process. The proposed FS layer ranks the features' importance according to their weights based on a threshold produced by a nonlinear function. The more important the features, the more discriminative power they have. The rest of the paper is organized as follows: Section 2 presents the literature review on feature selection and feature importance and their effectiveness on classification models. Section 3 presents the materials and methodologies as preliminaries to the proposed algorithm. The proposed algorithm is discussed in Section 4, followed by the experiment details and the final results in Section 5. Finally, the conclusion is given in Section 6.

    2 Literature Review

    As neural networks have been widely used in classification problems, many researchers have sought to enhance their learning and classification capabilities by modifying their structure or applying feature selection techniques. For instance, Wang et al. [12] combined bottom-up feature extraction and a top-down cognitive bias into one cohesive structure to produce a new attentional neural network architecture. As the authors mention, this framework can efficiently handle noisy and complicated segmentation problems with a high level of accuracy. Another study using neural networks in feature selection was presented by Bugata et al. [9]. In this study, the authors developed neural networks to find the significant input variables and produced a supervised feature selection method called the sparse neural network layer with normalizing constraints. Also, Mocanu et al. [13] replaced the traditional ANN's fully connected layer with sparse layers to enhance classification accuracy. Ho et al. [8] developed a theoretical criterion for using the adaptive group lasso to obtain significant features of NNs. Akintola et al. [14] scrutinized the influence of filter-based classification techniques on predicting software defects. They evaluated the FilterSubsetEval, CFS, and PCA algorithms on available datasets from the metric data program software repository and NASA, using Naïve Bayes, KNN, the J48 decision tree, and MLP as classifiers. Another novel approach to constrained feature selection was proposed by Rostami et al. [15]. The presented approach is a pairwise constraints-based method for feature selection and dimensionality reduction.

    The increasing dimensionality of data has encouraged researchers to adopt different types of DNNs due to their robust capabilities in classification and prediction. Therefore, several works in the literature have developed feature selection and feature ranking techniques to improve the classification performance of DNNs. Liu et al. [16] proposed an improved feature derivation and selection method for a hybrid deep learning approach. They used a deep neural network (DNN) and multilayer bi-directional long short-term memory (BiLSTM) with an attention mechanism to forecast overdue repayment behavior. Jiang et al. [17] proposed an improved feature selection approach by integrating a CNN with the Relief algorithm. Also, Kaddar et al. [7] used the ANOVA technique to find non-redundant representations in CNNs by obtaining the feature maps with varied neuron responses. In the same context, Nasir et al. [18] proposed a deep convolutional neural network (DCNN) for real-time document classification based on the Pearson correlation coefficient to select the optimal feature subset. Another study using the correlation coefficient and an automatic modulation classification (AMC) scheme was presented by Lee et al. [19]. In [5], a feature selection framework based on recurrent neural networks (RNN) was proposed. This research presented different feature selection approaches based on the RNN architecture: long short-term memory (LSTM), bidirectional LSTM, and the gated recurrent unit (GRU). Also, a deep neural network-based feature selection (NeuralFS) was presented in [20]. Other supervised feature selection approaches based on developing the first layer in a DNN were presented in [21] and [6].

    As mentioned before, FIR is another direction for enhancing classification performance, as presented by Iqbal [11]. The authors of that study introduced a Correlation Assisted Neural Network (CANN) that calculates the feature importance weight based on the correlation coefficient between the class label and the features. Furthermore, M. Wojtas and K. Chen handled population-wise feature importance ranking by presenting an innovative dual-network architecture to obtain an optimal feature subset and rank the importance of the selected features [22].

    Several studies have aimed to increase the classification performance of clinical diagnosis systems by selecting the essential features. For instance, Bron et al. [23] introduced feature selection based on SVM weights to enhance a computer-aided diagnosis system for dementia. Also, Christo et al. [24] proposed a correlation-based ensemble feature selector for clinical diagnosis. They adapted three types of evolutionary algorithms (i.e., the lion optimization algorithm, the glowworm swarm optimization algorithm, and differential evolution) for feature selection, then applied AdaBoostSVM as a classifier in a gradient descendant backpropagation neural network. In recent years, neural networks have been widely combined and integrated with several FS methods to enhance classification performance, such as combining an NN with a feature selection method to analyze the specificity of HIV-1 protease [25], and integrating an NN with 10-fold cross-validation to diagnose liver cancer [26]. The paired-input nonlinear knockoff filter was combined with an MLP in [27]. Random forest in [28] and the Naive Bayes and SVM classifiers in [29] are other algorithms used in recent studies of FS problems.

    Regarding COVID-19, which is still the focus of the world's attention, several works in the literature have tried to handle different aspects of this disease: analyzing its viral infections, classifying its textual clinical reports, and detecting the significant features for predicting its patients, to name a few. For instance, Khanday et al. [30] proposed a learning model to detect COVID-19 from textual clinical reports based on classical and ensemble machine learning algorithms. They used term frequency/inverse document frequency, report length, and bag of words as feature engineering techniques, and logistic regression (LR) with multinomial Naïve Bayes as a classifier. Likewise, Avila et al. [31] introduced a Naïve Bayes machine learning model to predict qRT-PCR test results, considered one of the most widely used clinical exams for COVID-19. Another machine learning algorithm based on feature importance was presented by Mondal et al. [32] to diagnose COVID-19. In that study, the authors applied MLP, XGBoost, and LR to classify COVID-19 patients based on a clinical dataset from Brazil. One more COVID-19 diagnosis strategy was proposed by Shaban et al. [1]. This strategy is based on a novel hybrid feature selection method that consists of two stages to select the essential features: a fast selection stage as a filter selection method and an accurate selection stage based on a genetic algorithm as a wrapper selection method. As another use for machine learning, Mollalo et al. [33] used MLP and LR to forecast cumulative COVID-19 incidence rates across the United States. Finally, Tab. 1 summarizes some of the recent studies that address the importance of FS.

    Table 1: Recent studies in using ML and DL in feature selection


    3 Materials and Methods

    3.1 Data Collection

    Clinical reports and laboratory analyses are two of the most popular tools for diagnosing COVID-19 cases. Three datasets in the form of clinical reports have been used in this study. The first dataset is available at https://www.kaggle.com/einsteindata4u/covid19. This dataset was collected from the SARS-CoV-2 RT-PCR and other laboratory tests performed on nearly 6000 COVID-19 cases during their visits to the emergency room. It includes 109 features and one class label. The second COVID-19 dataset includes clinical features for symptomatic and asymptomatic patients (e.g., comorbidities, vitals, epidemiologic factors, clinician-assessed symptoms, and patient-reported symptoms). This dataset consists of 34475 records with 41 features and one class label, and can be found at https://github.com/mdcollab/covidclinicaldata. The third dataset focuses on predicting intensive care unit (ICU) admission for positive COVID-19 cases based on clinical data. It comprises 1926 cases with 228 features and one class label, and it is available at https://www.kaggle.com/S%C3%ADrio-Libanes/covid19.

    3.2 Preliminaries

    Recently, constructing learning networks that are deeper and more complicated has emerged as a way to improve the performance of learning and classification processes. Consequently, the trainable parameters required by these deep networks in the learning process (i.e., the training and testing phases) have become increasingly numerous, which necessitates a massive amount of power and memory resources. A multilayer neural network such as the multilayer perceptron (MLP) model is an example of these types of deep networks. MLP is a feedforward neural network containing hidden layers between its input and output layers, as depicted in Fig. 2. It is classified as a supervised learning algorithm used for classification and regression [34].

    Figure 2: The architecture of the multilayer perceptron network

    In more detail, each layer in an MLP is made up of nodes and contains its own weight matrix W and bias b. Each node in the fully connected network has a connection with every node in the following layer. The input layer distributes the inputs to the subsequent layers based on a linear activation function without a threshold. After that, the hidden layers process the input values based on one or more nonlinear activation functions to feed the output layer. Finally, the output layer uses a linear activation function to produce the outcome, as given by Eqs. (1) and (2) [34,35].

    Some additional notations are used to distinguish between the variables of the hidden layers: superscripts define the layer number, and subscripts define the neuron number in the current layer (e.g., w2_1 means the weight value for neuron number 1 in layer number 2) [34]. The pseudo-code of the original MLP algorithm is displayed in Algorithm 1 [36].

    Algorithm 1: MLP Original Algorithm
    Input: The feature vector for each user R
    Begin
    1. Initialize the weight vector W
    2. While the error does not converge do
    3.   For all patterns P do:
    4.     For all output nodes j
    5.       Calculate the activation for output j
    6.       Calculate the error for output j
    7.       For all input nodes i to output node j
    8.         Calculate Δweight = Error_j * Activation_i
    9.         New_weight = weight + Δweight
    10.      End for
    11.    End for
    12.  End for
    13. End while
    End
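    The update loop of Algorithm 1 can be sketched for a single output node in NumPy. The learning rate, epoch count, and the logical-AND toy task are illustrative assumptions, and the weight update is taken as weight plus Δweight:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp_output_node(X, y, lr=0.5, epochs=500):
    """Delta-rule loop mirroring Algorithm 1: for every pattern, compute the
    activation (step 5) and error (step 6), then update each incoming weight
    with dW = error * activation (steps 8-9)."""
    rng = np.random.default_rng(1)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):              # "while error does not converge"
        for xi, yi in zip(X, y):         # "for all patterns P"
            out = sigmoid(xi @ w + b)    # activation of the output node
            err = yi - out               # error for the output node
            w += lr * err * xi           # new_weight = weight + dweight
            b += lr * err
    return w, b

# Learn logical AND as a toy pattern set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
w, b = train_mlp_output_node(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds)  # -> [0 0 0 1]
```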

    4 The Proposed Multilayer Perceptron Network-Based Feature Selection(MLPFS)

    In this study, a modified MLP with an additional FS layer is proposed to improve the accuracy of COVID-19 detection. As depicted in Fig. 3, the proposed MLPFS architecture is constructed from three bunches. The initial bunch consists of the input and feature selection layers, which are connected in a one-to-one manner: every node in the input layer connects only with its corresponding node in the FS layer. The second bunch contains the hidden layers, in which nodes are fully connected and feed forward. Finally, the third bunch is the output layer, which uses a nonlinear activation function to produce a binary output. The pseudo-code of the proposed MLPFS algorithm is shown in Algorithm 2.

    Figure 3: The architecture of the proposed MLPFS network

    Algorithm 2: MLPFS Algorithm
    Input: the data divided into two sets, 60% training_set and 40% testing_set
    Begin:
    1. Initialize the DNN model
    2. Initialize one input layer (L1), one fet_sel layer (L2), two hidden layers (L3, L4), and one output layer (L5)
    3. Initialize the input vector as X1[n], where n is the number of input features   /* input layer L1 */
    4. For each x1_i, i = 1 to n
    5.   Calculate the weight of fet_sel, W1_i, by W1_i = G(0, sqrt(2/n)), for all i = 1, ..., n
    6.   Calculate the Relu function for Wfs by Relu(W1_i) = max(0, W1_i), for all i = 1, ..., n   /* output of fet_sel layer L2 */
    7.   Calculate the output H2_i by H2_i = X ∘ Relu(Wfs), for all i = 1, ..., n
    8. End for
    9. While the error does not converge do
    10.   For each h3_i, i = 1 to n   /* inputs for hidden neurons in L4 */
    11.     Define the weight values randomly between [0, 1]
    12.     For each h2_i, i = 1 to n   /* inputs for hidden neurons in L3 */
    13.       Define the weight values randomly between [0, 1]
    14.       Calculate the output O3_i by Eq. (4)   /* output of layer L3 */
    15.       Apply the Relu activation function for hidden layer L3
    16.     End for
    17.     Calculate the output O4_i = WL_i · H(L-1)_i + bL, for all i = 1, ..., n and L = 1, ..., 5   /* output of layer L4 */
    18.     Apply the Relu activation function for hidden layer L4
    19.   End for
    20.   For each OL_i, i = 1 to n   /* output layer L5 */
    21.     Define the weight values randomly between [0, 1]
    22.     Calculate the cross-entropy cost function by CE = -Σ_{i=1..n} (y_i log(hL_i) + (1 - y_i) log(1 - hL_i))
    23.     Calculate the sigmoid activation function to find the net5 outcome
    24.   End for
    25.   Train the model
    26.   Net = train(inputs, selected features, net_outcome, desired_output)
    27.   Calculate the cross-entropy cost function
    28.   Apply backward propagation and update the weights
    29.   Calculate the accuracy
    30. End while
    End
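    The cross-entropy cost of step 22 can be computed directly; this sketch adds clipping for numerical stability, an implementation detail not stated in the paper:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Step 22 of Algorithm 2: CE = -sum_i [y_i*log(h_i) + (1-y_i)*log(1-h_i)].
    Predictions are clipped away from 0 and 1 so log() never sees zero."""
    h = np.clip(y_pred, eps, 1.0 - eps)
    return -np.sum(y_true * np.log(h) + (1.0 - y_true) * np.log(1.0 - h))

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.1, 0.8])
print(round(binary_cross_entropy(y_true, y_pred), 4))  # -> 0.4339
```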

    1-Input and Feature Selection Bunch

    This bunch is considered the adaptive phase of the proposed MLPFS network, as it contains the FS layer that modifies its node weights based on the importance of the original features during the training phase, keeping only the features that surpass the Rectified Linear Unit (ReLU) activation function [37]. Initially, the number of neurons in the input layer is always equal to the number of original features of the utilized dataset, and every input neuron connects with exactly one neuron in the FS layer. The initial weights of these connections are set based on a modified version of the Gaussian distribution in Eq. (3) [38].

    where n is the number of input features and W1_i is the weight vector of the FS layer. The ReLU activation function is applied to the generated weights by Eq. (4) to calculate the feature importance value. The ReLU function keeps only positive values, changing all negative values to zero and neglecting them. After that, the element-wise multiplication between the ReLU value of the produced weight vector W1_i and the original feature vector X is calculated by Eq. (5) to produce the output of every node of the FS layer, H2_i [6,37].

    where m is the number of significant features.
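    Taking G(0, sqrt(2/n)) in Eq. (3) to mean a zero-mean Gaussian with standard deviation sqrt(2/n) (a He-style initialization, which is our reading of the text), the FS layer of Eqs. (3)-(5) can be sketched as:

```python
import numpy as np

def feature_selection_layer(X, seed=0):
    """FS layer of MLPFS: draw one weight per feature from G(0, sqrt(2/n))
    (Eq. 3), pass the weights through ReLU (Eq. 4), and multiply them
    element-wise with the inputs (Eq. 5). Features whose weight falls
    below zero are gated out entirely."""
    n = X.shape[1]
    rng = np.random.default_rng(seed)
    w = rng.normal(loc=0.0, scale=np.sqrt(2.0 / n), size=n)  # Eq. (3)
    gate = np.maximum(0.0, w)                                # Eq. (4)
    H = X * gate                                             # Eq. (5)
    kept = np.flatnonzero(gate > 0)  # the m surviving features
    return H, kept

X = np.arange(12, dtype=float).reshape(3, 4)  # 3 samples, 4 features
H, kept = feature_selection_layer(X)
print(H.shape)  # -> (3, 4)
```

    During training the gating weights would be updated by backpropagation, so the set of surviving features can change from epoch to epoch.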

    By completing this bunch, the learning rates of the connections between the output of the FS layer (i.e., the new values of the input features) and the first hidden layer nodes are adjusted by the new feature importance values. In more detail, based on the ReLU threshold, all produced negative weights are converted to zero and omitted. Then the heuristic process begins by ranking the importance of the features with positive weights (i.e., more essential features have higher overall connection weights, while redundant and irrelevant features have lower overall connection weights). Finally, the outcome of this bunch feeds the rest of the network to carry out the learning process with the chosen classifier.

    2-Hidden Layers and Output Bunches

    The input of this bunch is the selected features H2_m resulting from the FS layer. In an MLP, the number of hidden layers and the number of hidden neurons influence learning accuracy. As is common in deep networks, more than one hidden layer improves classification accuracy; still, it may cause over-fitting of the training data if the network is trained for too many epochs [39]. In this study, we have tried to overcome this challenge by using two hidden layers H3_k, H4_k in the proposed MLP with 50 epochs for network training. To decide the optimal number of neurons in each hidden layer, we have adopted the rule of thumb presented in [40]. As shown in Eq. (6), the number of hidden neurons in each hidden layer k can be calculated as follows:

    Tab. 2 displays the parameters of the proposed MLPFS for each utilized dataset. The activation functions used in the hidden layers and the output layer are ReLU and sigmoid, respectively. Eqs. (7) and (8) calculate the outcome of the output layer.

    Table 2: The parameters of the MLPFS network for the utilized datasets

    5 Experiment Results and Discussion

    The results of the proposed MLP-based FS (MLPFS) algorithm are shown here. The effectiveness of the proposed enhancements is highlighted, demonstrating the MLPFS algorithm's performance in the feature selection and classification processes. As indicated in Tabs. 4–6, we compared the proposed algorithm to other existing techniques to obtain a consistent comparison. The proposed MLPFS algorithm is compared to several filter-based feature selection algorithms with different classifiers, such as Pearson correlation with a neural network [41], chi-square with a neural network [42], chi-square with a support vector machine [43], chi-square with a boosted decision tree [44], and chi-square with logistic regression [45]. Also, the proposed MLPFS is compared to wrapper-based feature selection algorithms such as deep SVM [46] and CancelOut deep neural networks [6]. The algorithms were tested on the three COVID-19 datasets and evaluated based on their predictive accuracy and processing time. The parameters used in all of the algorithms are shown in Tab. 3. The experiment was implemented in Python using the Keras 0.2.0 library, integrated into the open-source TensorFlow library [47]. The implementation was run on an Intel(R) Core i7 2.81 GHz CPU with 8 GB RAM and the Windows 10 operating system.

    Table 3: The setting of parameters for the tested algorithms used in the evaluation

    5.1 Performance Evaluation Measures

    To evaluate the performance of the proposed MLPFS algorithm, the statistical results for all algorithms used in the comparison are calculated, including the number of selected features (SF), AUC, and processing time. In addition, the confusion matrix values (i.e., TP, TN, FP, and FN) are used to find the classifier's performance in terms of accuracy, precision, sensitivity, and F score [48].

    · Accuracy: the accuracy metric (Acc) identifies the rate of correct data classification. It is calculated by Eq. (9):

    · Precision: the ratio of true positives to all predicted positive patterns. It is calculated by Eq. (10):

    · Sensitivity (Recall): the ratio of true positives to all positive patterns in the dataset. It is calculated by Eq. (11):

    · F1 score: measures the accuracy of the model on the dataset, combining precision and recall. It is calculated by Eq. (12):
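    The four measures in Eqs. (9)-(12) follow directly from the confusion-matrix counts; the counts used below are illustrative:

```python
def classification_metrics(tp, tn, fp, fn):
    """Eqs. (9)-(12): accuracy, precision, recall (sensitivity) and F1 score
    computed from the confusion-matrix values TP, TN, FP and FN."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)            # Eq. (9)
    precision = tp / (tp + fp)                            # Eq. (10)
    recall = tp / (tp + fn)                               # Eq. (11)
    f1 = 2 * precision * recall / (precision + recall)    # Eq. (12)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = classification_metrics(tp=80, tn=90, fp=10, fn=20)
print(round(acc, 4), round(prec, 4), round(rec, 4), round(f1, 4))
# -> 0.85 0.8889 0.8 0.8421
```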

    5.2 Statistical Results Analysis

    The performance of the proposed MLPFS algorithm for feature selection and classification is investigated in terms of feature size, accuracy, and processing time. The final feature sizes for the three utilized datasets are displayed in Tabs. 4–6. As shown, the proposed MLPFS model achieved a higher feature reduction than the other experimented models: almost 33% feature reduction for the first dataset, 34% for the second dataset, and 35% for the third dataset. The achieved feature reduction is higher than that of the other models by at least 7% for the first dataset, 8% for the second dataset, and around 13% for the third dataset. MLPFS succeeded in identifying the most common COVID-19 symptoms amongst the most informative features with respect to feature importance. For instance, applying MLPFS on the second COVID-19 dataset gives the highest weights to smell loss, respiratory rate, and cough severity. In contrast, these same symptoms received lower weights in the CancelOut DNN algorithm [6].

    Table 4: Comparison between the proposed approaches based on accuracy and time for the SARS-CoV-2 RT-PCR dataset

    Table 5: Comparison between the proposed approaches based on accuracy and time for the second COVID-19 dataset


    Table 6: Comparison between the proposed approaches based on accuracy and time for the ICU dataset

    Moreover, MLPFS recorded accuracy rates of 91.4%, 98.4%, and 88.4% for the SARS-CoV-2 RT-PCR, second COVID-19, and ICU datasets, respectively. CancelOut DNN achieved the second-highest accuracy, with 90.9%, 97.4%, and 84.5% on the three datasets. Chi & LR achieved the third-highest accuracies, with 90.6%, 97.4%, and 80.5%. Despite achieving the same accuracy on dataset 1, Chi-square & NN achieved around 1% higher accuracy than Pearson & NN on both dataset 2 and dataset 3. Chi & SVM and Chi & boosted DT achieved almost the same accuracies on all the datasets. Deep SVM achieved the lowest accuracy, 66.7%, on the first dataset, with a high deviation from the average of around 0.2.

    Figure 4: Computation time for the tested algorithms on the SARS-CoV-2 RT-PCR dataset

    Additionally, the results indicate that MLPFS achieved at least 8% and 2% higher precision for the first and third datasets, respectively. It also achieved nearly 2% and 1% higher recall and 26% and 9% higher F1 scores for the second and third datasets. Regarding processing time, as depicted in Figs. 4–6, MLPFS recorded the minimum processing time compared to the other experimented models, finishing the complete process in 7.6, 28.7, and 5.01 s for the three datasets, respectively. CancelOut DNN is next in speed, with 8.2, 29.5, and 5.2 s. Finally, the validation accuracy per epoch for the eight algorithms on the three utilized datasets is displayed in Figs. 7–9.

    Figure 5: Computation time for the tested algorithms on the second COVID-19 dataset

    Figure 6: Computation time for the tested algorithms on the ICU dataset

    Figure 7: Validation accuracy per epoch for the SARS-CoV-2 RT-PCR dataset

    Figure 8: Validation accuracy per epoch for the second COVID-19 dataset

    Figure 9: Validation accuracy per epoch for the ICU dataset

    6 Conclusion

    Up to now, COVID-19 remains a pandemic and threatens the lives of many people. Data mining techniques play an essential role in diagnosing and treating COVID-19. This work presents MLPFS, an MLP-based classification model for COVID-19 prediction with a feature selection and weighting layer. Three clinical COVID-19 datasets were used for our experiments. MLPFS's performance was evaluated against seven different classification models. The evaluation results showed that MLPFS outperformed all the other tested models in terms of accuracy indicators, number of extracted features, and processing time.

    Funding Statement:The authors received no specific funding for this study.

    Conflicts of Interest: The authors have declared that there is no conflict of interest and no non-financial competing interests.

久久精品国产亚洲av涩爱 | 亚洲欧美中文字幕日韩二区| 免费av观看视频| 免费观看a级毛片全部| 日日干狠狠操夜夜爽| av黄色大香蕉| 寂寞人妻少妇视频99o| 国产一区二区在线av高清观看| a级毛片免费高清观看在线播放| 精品久久久久久久久亚洲| 亚洲国产欧美人成| 亚洲四区av| 乱码一卡2卡4卡精品| 亚洲国产精品合色在线| 天天躁夜夜躁狠狠久久av| 男人舔女人下体高潮全视频| 亚洲av男天堂| 国语自产精品视频在线第100页| 国产美女午夜福利| 久久午夜亚洲精品久久| 亚洲欧美清纯卡通| 一本一本综合久久| 色播亚洲综合网| 亚洲av中文av极速乱| 村上凉子中文字幕在线| 色吧在线观看| 免费电影在线观看免费观看| 看片在线看免费视频| 欧美激情久久久久久爽电影| 亚洲欧美日韩东京热| 国产午夜福利久久久久久| 在线观看美女被高潮喷水网站| 国内久久婷婷六月综合欲色啪| 国内少妇人妻偷人精品xxx网站| av卡一久久| 亚洲国产精品久久男人天堂| 日本撒尿小便嘘嘘汇集6| 蜜桃亚洲精品一区二区三区| 日本成人三级电影网站| 久久久久久伊人网av| 精品欧美国产一区二区三| 亚洲性久久影院| 亚洲精品粉嫩美女一区| 国产综合懂色| 亚洲自拍偷在线| 国产片特级美女逼逼视频| 国产毛片a区久久久久| 一级毛片电影观看 | 亚洲av熟女| 国产午夜精品一二区理论片| 免费不卡的大黄色大毛片视频在线观看 | 只有这里有精品99| 亚洲欧美日韩高清专用| 99久久九九国产精品国产免费| 国产精品一区二区三区四区久久| 亚洲精品国产av成人精品| 99久久九九国产精品国产免费| 国产精品一区二区三区四区久久| 亚洲国产精品成人久久小说 | 91在线精品国自产拍蜜月| 亚洲在线自拍视频| eeuss影院久久| 国产淫片久久久久久久久| 九草在线视频观看| 成年女人永久免费观看视频| 久久精品国产亚洲av天美| 亚洲av中文av极速乱| 一级毛片我不卡| 亚洲性久久影院| 欧美精品国产亚洲| 91精品国产九色| 欧美精品一区二区大全| 一级毛片电影观看 | 美女大奶头视频| 99九九线精品视频在线观看视频| 午夜久久久久精精品| 日韩欧美精品v在线| 99在线视频只有这里精品首页| 岛国毛片在线播放| 成人特级av手机在线观看| 天堂影院成人在线观看| 精品人妻视频免费看| 日韩亚洲欧美综合| 国产精品.久久久| 亚洲最大成人av| 亚洲av成人精品一区久久| 3wmmmm亚洲av在线观看| 久久精品夜色国产| 亚洲av二区三区四区| 99九九线精品视频在线观看视频| 国产成人a区在线观看| 国产不卡一卡二| 99热6这里只有精品| 99视频精品全部免费 在线| 免费观看人在逋| 久久这里有精品视频免费| 色吧在线观看| 亚洲第一电影网av| 97人妻精品一区二区三区麻豆| 国产精品美女特级片免费视频播放器| 中文在线观看免费www的网站| 国产91av在线免费观看| 日本黄大片高清| 99久久无色码亚洲精品果冻| 欧美潮喷喷水| 亚洲成人中文字幕在线播放| 99久久无色码亚洲精品果冻| 久久久欧美国产精品| 国产精品蜜桃在线观看 | av在线蜜桃| 禁无遮挡网站| 小说图片视频综合网站| 国产成人a∨麻豆精品| 岛国毛片在线播放| 搡老妇女老女人老熟妇| 亚洲人成网站在线观看播放| 深夜精品福利| 真实男女啪啪啪动态图| 男女做爰动态图高潮gif福利片| 最近的中文字幕免费完整| 国产在线精品亚洲第一网站| 国产精品av视频在线免费观看| 啦啦啦啦在线视频资源| 日韩一本色道免费dvd| 麻豆国产av国片精品| 色哟哟哟哟哟哟| 午夜精品在线福利| av天堂中文字幕网| 69av精品久久久久久| 国产成人freesex在线| 亚洲最大成人av| 看片在线看免费视频| 97超视频在线观看视频| 尤物成人国产欧美一区二区三区| 两个人的视频大全免费| 欧美性感艳星| 午夜视频国产福利| 少妇丰满av| 国产成人a∨麻豆精品| 国产免费男女视频| 国产精品日韩av在线免费观看| 啦啦啦观看免费观看视频高清| 色吧在线观看| 日本在线视频免费播放| 91av网一区二区| 麻豆精品久久久久久蜜桃| 国产免费一级a男人的天堂| 一级毛片我不卡| 一个人看的www免费观看视频| 亚洲七黄色美女视频| 色5月婷婷丁香| 噜噜噜噜噜久久久久久91| 午夜视频国产福利| 国产一区二区激情短视频| 久久久a久久爽久久v久久| 日日撸夜夜添| 亚洲第一电影网av| 精品一区二区免费观看| 国产久久久一区二区三区| 亚洲精品成人久久久久久| 久久精品国产99精品国产亚洲性色| 人人妻人人澡欧美一区二区| 一级黄片播放器| 日韩欧美 国产精品| 
日韩亚洲欧美综合| 亚洲国产高清在线一区二区三| 成人无遮挡网站| 国产白丝娇喘喷水9色精品| 两个人视频免费观看高清| 尾随美女入室| 成人漫画全彩无遮挡| 中文字幕av在线有码专区| 九色成人免费人妻av| 国产精品一二三区在线看| 成人亚洲欧美一区二区av| 中国国产av一级| 国产av麻豆久久久久久久| 精品久久久久久久久av| 99久久精品一区二区三区| 一本—道久久a久久精品蜜桃钙片 精品乱码久久久久久99久播 | 国产真实乱freesex| 精品熟女少妇av免费看| 美女cb高潮喷水在线观看| 亚洲第一区二区三区不卡| 一本一本综合久久| 我的老师免费观看完整版| 美女内射精品一级片tv| 伊人久久精品亚洲午夜| 亚洲精品色激情综合| 免费看日本二区| 1000部很黄的大片| 欧美激情在线99| 小蜜桃在线观看免费完整版高清| 久久久成人免费电影| 啦啦啦啦在线视频资源| 97人妻精品一区二区三区麻豆| 亚洲欧美精品综合久久99| www日本黄色视频网| 国产精品久久久久久久久免| 亚洲欧美日韩无卡精品| 中文字幕精品亚洲无线码一区| 一个人看视频在线观看www免费| 亚洲欧洲国产日韩| 久久九九热精品免费| 国产精品电影一区二区三区| 亚洲真实伦在线观看| 女人十人毛片免费观看3o分钟| 99久久成人亚洲精品观看| 永久网站在线| a级毛片免费高清观看在线播放| 成人鲁丝片一二三区免费| 网址你懂的国产日韩在线| 亚洲av免费在线观看| 精品无人区乱码1区二区| 国产精品乱码一区二三区的特点| 丰满的人妻完整版| 亚洲国产色片| 亚洲无线观看免费| 99久久精品热视频| av女优亚洲男人天堂| 亚洲精华国产精华液的使用体验 | 国产精品人妻久久久影院| 简卡轻食公司| 免费看光身美女| 国产精品国产三级国产av玫瑰| 国产美女午夜福利| 大香蕉久久网| 波多野结衣高清作品| 日本黄色片子视频| 国产精品1区2区在线观看.| 国产av不卡久久| 国产一级毛片在线| 亚洲电影在线观看av| 嫩草影院精品99| 直男gayav资源| 大型黄色视频在线免费观看| 两个人视频免费观看高清| 国产日韩欧美在线精品| 91aial.com中文字幕在线观看| 国产精品久久视频播放| av专区在线播放| 国内揄拍国产精品人妻在线| 亚洲国产精品成人久久小说 | 亚州av有码| 欧美丝袜亚洲另类| 久久久久久久久久久免费av| 又粗又硬又长又爽又黄的视频 | 别揉我奶头 嗯啊视频| 久久精品人妻少妇| 嫩草影院入口| 内射极品少妇av片p| 性插视频无遮挡在线免费观看| 免费av不卡在线播放| 久久午夜亚洲精品久久| 91午夜精品亚洲一区二区三区| 亚洲成a人片在线一区二区| 哪个播放器可以免费观看大片| 亚洲天堂国产精品一区在线| 亚洲内射少妇av| 91狼人影院| 日日摸夜夜添夜夜爱| 人妻制服诱惑在线中文字幕| 床上黄色一级片| 国产视频内射| 在线观看66精品国产| a级毛片a级免费在线| 精品久久久久久久久久久久久| av在线观看视频网站免费| 午夜福利视频1000在线观看| 久久久久久久久久成人| 一边亲一边摸免费视频| 校园春色视频在线观看| 日韩精品青青久久久久久| 插阴视频在线观看视频| 伊人久久精品亚洲午夜| 久久久久久国产a免费观看| 亚洲第一电影网av| 日本与韩国留学比较| 亚洲欧美中文字幕日韩二区| 国产高清不卡午夜福利| 一夜夜www| 国产在线男女| 国产成年人精品一区二区| 亚洲人成网站在线播| 久久久午夜欧美精品| 在现免费观看毛片| 亚洲av二区三区四区| 一个人免费在线观看电影| 国产av麻豆久久久久久久| 久久久久久伊人网av| 欧美成人免费av一区二区三区| 欧美+亚洲+日韩+国产| 久久久久九九精品影院| 老熟妇乱子伦视频在线观看| av在线天堂中文字幕| 一级毛片aaaaaa免费看小| 97超视频在线观看视频| 国内久久婷婷六月综合欲色啪| 少妇人妻一区二区三区视频|