
DNEF: A New Ensemble Framework Based on Deep Network Structure

Computers, Materials & Continua, 2023, Issue 12

Siyu Yang, Ge Song*, Yuqiao Deng, Changyu Liu and Zhuoyu Ou

1 College of Mathematics and Informatics, South China Agricultural University, Guangzhou, 510642, China

2 School of Statistics and Mathematics, Guangdong University of Finance and Economics, Guangzhou, 510120, China

ABSTRACT Deep neural networks have achieved tremendous success in various fields, and the structure of these networks is a key factor in their success. In this paper, we focus on ensemble learning based on deep network structure and propose a new deep network ensemble framework (DNEF). Unlike other ensemble learning models, DNEF is an ensemble learning architecture of network structures, with serial iteration between the hidden layers, while base classifiers are trained in parallel within these hidden layers. Specifically, DNEF uses randomly sampled data as input and implements serial iteration based on the weighting strategy between hidden layers. In the hidden layers, each node represents a base classifier, and multiple nodes generate training data for the next hidden layer according to the transfer strategy. DNEF operates based on two strategies: (1) the weighting strategy calculates the training instance weights of the nodes according to their weaknesses in the previous layer; (2) the transfer strategy adaptively selects each node's instances with weights as transfer instances and transfer weights, which are combined with the training data of nodes as input for the next hidden layer. These two strategies improve the accuracy and generalization of DNEF. The ensemble of all nodes serves as the final output of DNEF. The experimental results reveal that the DNEF framework surpasses traditional ensemble models, achieving high accuracy through its novel deep ensemble methods.

KEYWORDS Machine learning; ensemble learning; deep ensemble; deep network structure; classification

    1 Introduction

Nowadays, machine learning and artificial intelligence have been applied to various fields with remarkable success. Ensemble learning is of great interest among machine learning and artificial intelligence methods because of its ability to combine numerous learners to significantly improve the accuracy and generalization performance of the model. Dietterich demonstrated the performance advantages of ensemble learning over a single classifier in statistical, computational, and representational ways [1]. In addition, base classifiers in an ensemble model are typically weak classifiers, which are more trainable than a single strong classifier.

The main challenges in designing ensemble models are the diversity of sub-learners and the ensemble strategies. (1) The diversity of sub-learners: diversity prompts the provision of complementary information among sub-learners [2] and is advantageous in improving the performance of the ensemble model [3]. To maintain the diversity of sub-learners, most ensemble algorithms construct different types of sub-learners on the same instance space or the same type of sub-learners on different instance spaces (e.g., ensemble methods based on bagging, including Random Forest (RF) [4] and Bagging of Extrapolation Borderline-SMOTE SVM (BEBS) [5]). (2) Ensemble strategies for base classifiers: typical ensemble strategies include boosting [6] and the base classifier weight-based strategy. Boosting methods improve the classification ability on misclassified instances by adjusting the instance weights to obtain a higher-performing ensemble model. The sub-classifier weight-based strategy improves the performance of the ensemble model by adjusting the weights of the base classifiers.

Traditional algorithms often adopt a single strategy, ignoring the complex representation of the ensemble model. In this paper, we propose the Deep Network Ensemble Framework (DNEF), which draws on the concept of deep learning and the structure of the neural network. The DNEF framework adopts a hierarchical network structure that combines various ensemble strategies to improve both the diversity of base classifiers and the generalization ability of the ensemble framework. The novelty of the proposed DNEF includes: (1) construction of different types of base classifiers on various instance subspaces to maximize the diversity of base classifiers in the hidden layer; (2) combination of the boosting ensemble strategy and the metric-based ensemble strategy to optimize instance distribution, improve generalization capability, determine base classifier weights, and transfer misclassified instances between the hidden layers; (3) adoption of an intra-layer parallel, inter-layer serial training model to enhance training efficiency; (4) comparative experiments showing that our proposed DNEF achieves significant improvement in classification accuracy compared to popular ensemble algorithms.

    2 Related Works

Our approach is related to ensemble learning. We briefly describe the related work, including traditional ensemble learning and deep ensemble learning, in the following subsections.

    2.1 Traditional Ensemble Learning

Ensemble learning is one of the most effective learning paradigms in machine learning. The procedure of ensemble learning involves combining multiple weak learners, according to a certain strategy, into a strong learner with higher predictive performance [7]. Compared to individual algorithms, ensemble learning can combine the advantages of each weak learner to represent complex models more accurately, reducing overfitting and providing higher learning performance. Thus, ensemble learning has a wide range of applications for complex data models such as loan approval [8], speech recognition [9], image recognition [10], and industrial process simulation [11].

As mentioned above, the main challenges in designing ensemble models are the diversity of base classifiers and the ensemble strategy. Several typical ensemble learning models are described below.

    2.1.1 Random Forest

Random Forest is a model for classification and regression that combines the "bagging" method with random feature selection. Random Forest employs the bagging method to select instances to train different random decision trees, which are then combined based on an ensemble strategy.
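
As a minimal illustration of the bagging-plus-random-feature-selection mechanism just described (not the experimental setup of this paper; the toy data and parameter values are placeholders), a scikit-learn sketch could look like this:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy data standing in for a real classification task.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each tree is trained on a bootstrap sample (bagging) and considers a
# random subset of features at every split (random feature selection).
rf = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)
rf.fit(X_tr, y_tr)
print("Random Forest accuracy:", rf.score(X_te, y_te))
```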

Current improvements to Random Forest have focused on increasing the diversity of the trees by adding random factors (Menze [12], Zhang et al. [13–15]) and on designing ensemble strategies. For example, Utkin et al. proposed optimizing weights in terms of forest accuracy [16,17]. Random Forest has been widely used in different classification scenarios in recent years. In industrial scenarios, Paul et al. used an improved Random Forest for the classification of dual-phase (DP) steel microstructures [18]. In intrusion detection systems, Resende et al. argued that Random Forest has the advantage of speed and efficiency over common machine learning methods [19].

    2.1.2 AdaBoost

AdaBoost [20] is a highly popular boosting algorithm for improving classification accuracy. It iteratively constructs multiple weak classifiers, with each classifier improving performance by adjusting instance weights. During each iteration, AdaBoost increases the weights of the instances that were misclassified in the previous iteration, directing the classifier's attention towards these instances in subsequent rounds and ultimately leading to higher accuracy. In the realm of chemical detection, Chen et al. found that AdaBoost can be used to simultaneously determine trace copper and cobalt in high-concentration zinc solution [21]. In the field of electrocardiogram (ECG) classification, Barstuğan et al. employed AdaBoost as a dictionary-based classifier to classify ECG signals [22].
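
To make the reweighting idea concrete, here is a minimal scikit-learn sketch (the toy data and parameters are illustrative placeholders, not the settings used in this paper's experiments):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# After each round, misclassified instances receive larger weights, so the
# next weak learner (a depth-1 tree by default) focuses on them.
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_tr, y_tr)
print("AdaBoost accuracy:", ada.score(X_te, y_te))
```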

    2.1.3 Stacking

Stacking [23–25] is a hierarchical ensemble learning framework. Stacking methods learn several base classifiers (usually of different types) using the initial training data and employ the predictions generated by these base classifiers as a new training set to train a new classifier. Stacking can improve accuracy by reducing the bias in the data. Stacking technology has made significant progress in a large number of application areas. In mining engineering, Koopialipoor et al. predicted rock deformation using a stacked structure of tree, Random Forest (RF), K-Nearest-Neighbors (KNN), and Multilayer Perceptron (MLP) models [26]. For diabetes identification, Kalagotla et al. developed a novel stacking technique with a multi-layer perceptron, support vector machine, and logistic regression to predict diabetes [27].
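
A minimal scikit-learn sketch of the two-level idea (heterogeneous base classifiers whose predictions train a meta-classifier) might look as follows; the particular estimators are arbitrary choices for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Level 0: base classifiers of different types. Level 1: a meta-classifier
# trained on their cross-validated predictions.
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
                ("knn", KNeighborsClassifier())],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X, y)
print("Stacking training accuracy:", stack.score(X, y))
```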

    2.1.4 Weighted-Based Voting

The voting strategy, as a decision rule, is important in ensemble learning. Typically, predictions are made through majority voting. Voting strategies can be broadly divided into weighted and unweighted voting. Among them, weighted voting is a commonly employed approach, which can be further divided into dynamic weighted voting [28] and static weighted voting. Voting technology has become well-established in a variety of classification scenarios. In the domain of tuberculosis prediction, Okezie et al. used the weighted voting ensemble technique to improve the accuracy of tuberculosis diagnosis [29]. For business failure prediction, Kim et al. used a majority voting ensemble method with a decision tree to predict business failure [30].
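
The distinction between the voting variants can be illustrated with scikit-learn's VotingClassifier; the static weights below are arbitrary examples (a dynamic weighted-voting scheme would instead adapt them at prediction time):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Soft voting averages predicted class probabilities; the fixed weights
# make this a static weighted-voting ensemble.
vote = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("dt", DecisionTreeClassifier(max_depth=5)),
                ("nb", GaussianNB())],
    voting="soft",
    weights=[2, 1, 1],
)
vote.fit(X, y)
print("Voting training accuracy:", vote.score(X, y))
```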

    2.2 Deep Ensemble Learning

After the significant development of deep learning, there has been widespread interest in how to combine deep learning and ensemble learning to leverage their respective advantages. Deep ensemble learning encompasses the design of ensemble models and fusion mechanisms. By aggregating multiple deep learning models, the hierarchical relationships between these models are used to achieve more powerful feature extraction and representation capabilities. In this section, we begin by introducing ensemble learning based on neural network modules, followed by an introduction to deep forest.

Sivakumar et al. proposed a deep learning-based graph neural network (DL-GNN) based on the ensemble of recurrent neural networks (RNN) and feedforward neural networks (FNN) [31]. Devarajan et al. introduced a deep learning model integrated with natural language processing (NDCBL), which combines a convolutional neural network (CNN), Bidirectional Long Short-Term Memory (Bi-LSTM), and Attention [32]. José et al. presented an automated deep learning-based breast cancer diagnosis model (ADL-BCD) for breast cancer diagnosis using digital mammograms. ADL-BCD is an ensemble model of Gaussian filter (GF)-based preprocessing, Tsallis entropy-based segmentation, ResNet34-based feature extraction, chimp optimization algorithm (COA)-based parameter tuning, and wavelet neural network (WNN)-based classification [33]. Hussain et al. developed an Ensemble Deep-Learning-Enabled Clinical Decision Support System for Breast Cancer Diagnosis (EDLCDS-BCDC), which uses an ensemble of VGG-16, VGG-19, and SqueezeNet for feature extraction [34].

Zhou et al. [35] proposed the Deep Forest (DF) model, which is a multi-grained cascade structure. DF typically comprises two steps: the multi-grained scanner and the cascade forest. The former extracts information from raw data, while the latter constructs an adaptive-depth ensemble model. DF is a model that combines ensemble learning and deep learning. On the one hand, it is an ensemble learning method based on the decision tree, inheriting most of the advantages of general ensemble learning methods. On the other hand, DF also has the advantages of deep neural networks, increasing the diversity of ensemble learning and improving model performance. Subsequently, deep forest methods have been improved (e.g., siamese deep forest [36] and imprecise deep forest for classification [37]) and applied to various domains, including software defect prediction, price prediction, prediction of protein interactions, and anti-cancer drug response prediction [38–41].

    2.3 Characteristics of DNEF

Our deep ensemble framework, DNEF, combines the structure of deep networks with ensemble learning. Specifically, we use a weighting strategy to drive the iteration of DNEF and a transfer strategy to ensure the architectural complexity of DNEF. Popular ensemble models such as RF and AdaBoost often employ a single strategy, while DNEF employs two ensemble strategies: the weighting strategy and the transfer strategy. These two strategies take into account both the relationship between classifiers and the complexity of the ensemble structure. Compared to deep ensemble models based on neural networks, DNEF does not incorporate deep learning modules and uses neither neural networks nor backpropagation. The weighting strategy in DNEF supports the iteration of the hidden layers, while the transfer strategy enhances their connectivity. These are the key differences between DNEF and deep ensemble learning.

    3 The Proposed Method

In this section, we introduce our proposed DNEF framework. We first describe DNEF in Section 3.1. We then explain the weighting strategy and transfer strategy of DNEF in Section 3.2.

    3.1 Introduction of DNEF

Inspired by deep neural networks, we propose the DNEF framework, which employs a deep network architecture. As shown in Fig. 1, the DNEF framework includes an input layer, hidden layers, and an output layer. The data in the input layer consists of randomly sampled instances $X_0$. The DNEF framework iterates between hidden layers based on a weighting strategy. Each hidden layer contains $m$ nodes, and each node represents a base classifier $h$ (in this paper, we use the decision tree as the base classifier). The inputs to a node are instances with their weights. Inside a hidden layer, each node generates transfer instances and corresponding weights based on the transfer strategy. The transfer instances and transfer instance weights of all nodes are aggregated with the input data to produce the input data of the next layer. The output of DNEF is obtained through a weighted ensemble of the classifiers across all layers.

Figure 1: The architecture of DNEF

It is worth noting that the "ensemble" in DNEF is based on a deep network structure. Two factors contribute to the accuracy and robustness of DNEF: first, each hidden layer of DNEF builds more accurate base classifiers by utilizing the results of the previous layer; second, independent classifiers are trained in parallel within each hidden layer to reduce variance.

It is also worth mentioning that the calculation of instance weights and the transfer strategy are key issues that directly affect the performance of DNEF. As one of the main contributions of this paper, we propose the weighting strategy and the transfer strategy to address them.

    The procedures for the weighting strategy and transfer strategy are described below:

1. The procedure of the weighting strategy:

(1) Calculation of the base classifier weights $W_c$: we compute the weight of each base classifier based on its current classification error $e$.

(2) Calculation of the instance weights $W_s$: the instance weights $W_s$ are calculated based on $W_c$.

2. The procedure of the transfer strategy:

(1) Selecting transfer instances $X'$ and transfer instance weights $W_s'$: we select transfer instances and their weights based on the size of the training instances and the accuracy of the base classifiers.

(2) Transferring $X'$ with transfer instance weights $W_s'$ to the next layer: we combine the transfer instances $X'$ of each base classifier, with their weights $W_s'$, into the input data of each base classifier to form the output data. It should be noted that the input data of a base classifier does not include its own transfer instances and transfer instance weights.

After describing the weighting strategy and the transfer strategy, we show the process of the DNEF framework. Each node in the input layer receives randomly sampled data $D_{0j}$ as its input:

where $X \in \mathbb{R}^n$, $X_{0j}$ represents the training instances of the j-th node in the input layer, $W_{s_{0j}}$ is the instance weights of $X_{0j}$, and $y$ represents the labels of the instances, $y \in Y = \{1, -1\}$. $m$ represents the total number of nodes in the hidden layer, and $D_0$ represents the input data of the input layer.

In the i-th hidden layer, $D_{ij}$ represents the training data of the j-th node. The weights $W_c$ of the base classifiers $h$ are calculated using the weighting strategy. The training data $D_{i+1}$ of the next hidden layer are generated according to the transfer strategy.

where $X_{ij}$ represents the training instances of the j-th node in the i-th layer, and $W_{s_{ij}}$ is the instance weights of $X_{ij}$.

The output $f$ of DNEF is the weighted ensemble of the base classifiers $H$ with weights $W_c$ that were trained by the deep network structure.

Algorithm 1 gives the process of DNEF, which combines the weighting strategy and the transfer strategy in the hidden layers.
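
Algorithm 1 itself is not reproduced in this excerpt. As a reading aid, the following Python sketch shows one plausible shape of the DNEF training loop under the definitions above; weighting_strategy and transfer_strategy stand for Algorithms 2 and 3 (sketched in Section 3.2), and all names here are ours, not the paper's:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_dnef(D0, n_layers, weighting_strategy, transfer_strategy):
    """One plausible reading of Algorithm 1. D0 is a list of m tuples
    (X_0j, Ws_0j, y_0j) of randomly sampled data, one tuple per node."""
    H, Wc = [], []                      # all base classifiers and their weights
    D = D0
    for _ in range(n_layers):           # serial iteration over hidden layers
        layer_outputs = []
        for X, Ws, y in D:              # nodes could be trained in parallel
            h = DecisionTreeClassifier(max_depth=5)
            h.fit(X, y, sample_weight=Ws)
            wc, Ws_new = weighting_strategy(h, X, Ws, y)
            H.append(h)
            Wc.append(wc)
            layer_outputs.append((h, X, Ws_new, y))
        # The transfer strategy assembles the next layer's training data from
        # each node's data plus the other nodes' transfer instances.
        D = transfer_strategy(layer_outputs)
    return H, np.array(Wc)

def predict_dnef(H, Wc, X):
    """Weighted ensemble of all nodes across all layers (labels in {-1, +1})."""
    votes = sum(wc * h.predict(X) for h, wc in zip(H, Wc))
    return np.sign(votes)
```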

    3.2 The Strategies of DNEF

In this section, we illustrate the two main strategies of the proposed DNEF framework. In Section 3.2.1, we explain the weighting strategy of DNEF, followed by a description of the transfer strategy in Section 3.2.2.

    3.2.1 DNEF’s Weighting Strategy

In this subsection, we describe the weighting strategy used in the DNEF framework. First, given the training data $D_{ij}$ for the j-th node in the i-th hidden layer, the base classifier $h_j$ is trained using the instances $X_{ij}$, with each instance assigned the weights $W_{s_{ij}}$. Next, the instance weights are updated based on the classifier error $e$. The computing procedure covers: 1. the base classifier weight $W_c$; 2. the instance weights $W_s$. Algorithm 2 illustrates the weighting strategy employed by DNEF in the i-th hidden layer.

    We first normalize the instance weights:

Then, the error $e$ of the base classifier is calculated based on the classification results of the classifier:

The base classifier weight of $h_j$ is expressed as:

where $\gamma$ is the threshold value we set, and $\alpha_{\gamma}$ is a constant needed so that the equation is continuous [42].

When considering the weight of $h_j$, a higher-performing base classifier should have a larger weight. Therefore, we use the square root function to scale down the weights of oversized classifiers:

After calculating the classifier weights $W_c$, we update the instance weights $W_s$ as follows:
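
The weight formulas above appear as display equations in the original and are not recoverable from this extraction. As a hedged stand-in, the sketch below strings together the steps just described (normalization, weighted error, a thresholded classifier weight, square-root scaling, and instance reweighting) using standard AdaBoost-style updates; the exact expressions, including the constant $\alpha_{\gamma}$, are assumptions rather than the paper's formulas:

```python
import numpy as np

def weighting_strategy(h, X, Ws, y, gamma=0.15):
    """Stand-in for Algorithm 2: returns the classifier weight wc and the
    updated instance weights. Labels y are assumed to be in {-1, +1}."""
    Ws = Ws / Ws.sum()                       # 1. normalize instance weights
    pred = h.predict(X)
    e = Ws[pred != y].sum()                  # 2. weighted classification error
    e = max(e, gamma)                        # 3. clip by the threshold gamma so a
                                             #    near-zero error stays bounded
                                             #    (the paper instead uses alpha_gamma
                                             #    to keep its expression continuous)
    wc = 0.5 * np.log((1.0 - e) / e)         # AdaBoost-style classifier weight
    wc = np.sqrt(wc) if wc > 1.0 else wc     # scale down oversized weights
    Ws = Ws * np.exp(-wc * y * pred)         # 4. boost misclassified instances
    return wc, Ws / Ws.sum()
```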

    3.2.2 The Adaptive Transfer Strategy in DNEF

We design a rule for each base classifier to adaptively choose transfer instances $X'$ and transfer instance weights $W_s'$. We then add the transfer instances $X'$ with their weights $W_s'$ to the input data to form the output data.

The new training data $D_{(i+1)j}$ is obtained as:

We obtain $D_{i+1}$ by executing the transfer strategy on all nodes of the i-th hidden layer.

The transfer strategy dynamically selects misclassified instances, together with their weights, for the next layer's training data. Algorithm 3 describes the transfer strategy.
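
Algorithm 3 is likewise not shown in this excerpt. Under the description above (each node's hard instances, with weights, are passed to the next layer, and a node never receives its own transfers back), a minimal sketch could look as follows; the selection rule and the role of the percentage $\rho$ are assumptions modeled on Section 4.2:

```python
import numpy as np

def select_transfer_instances(h, X, Ws, y, rho=0.10):
    """Adaptively pick a node's misclassified instances as transfer
    instances X' with transfer weights Ws' scaled by the percentage rho."""
    mis = h.predict(X) != y
    return X[mis], rho * Ws[mis], y[mis]

def transfer_strategy(layer_outputs):
    """Build D_{i+1}: each node keeps its own data and receives the
    transfer instances of the other nodes (never its own)."""
    transfers = [select_transfer_instances(h, X, Ws, y)
                 for (h, X, Ws, y) in layer_outputs]
    D_next = []
    for j, (_, X, Ws, y) in enumerate(layer_outputs):
        others = [t for k, t in enumerate(transfers) if k != j]
        D_next.append((np.vstack([X] + [t[0] for t in others]),
                       np.concatenate([Ws] + [t[1] for t in others]),
                       np.concatenate([y] + [t[2] for t in others])))
    return D_next
```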

    4 Experiments

The experimental environment used in this study consisted of an Intel(R) Xeon(R) CPU E5-2640 v3 and a Tesla K80 GPU with 11 GB of memory. All experiments were conducted in a Python environment. The Python libraries required for the experiments were Scikit-learn, NumPy, Pandas, and PyTorch. The methods selected for comparison included DNEF, Random Forest (RF), AdaBoost, Linear Support Vector Machine (SVM), MLP, and Voting (Decision Tree).

    4.1 Datasets

The performance of DNEF was validated using four datasets of varying dimensionality. The Adult dataset has 14 features, the REJAFADA dataset has 6824 features, the Segmentation dataset has 19 features, and the IMDB dataset has 74,849 features. We downloaded the Adult, REJAFADA, and Segmentation datasets from the UCI machine learning repository and the IMDB dataset from the Stanford repository.

Table 1 gives the class, instance size, and dimensional information of these datasets.

Table 1: Overview of the datasets

The Adult dataset comprises 14 features and 48,842 instances. The problem is to classify whether income exceeds $50K/year.

The REJAFADA dataset contains 6824 features and 1996 instances. The problem it addresses is classifying files as benign or malware.

The IMDB dataset contains 74,849 features and 50,000 instances. The IMDB dataset is represented by tf-idf features with positive and negative labels.

The Segmentation dataset consists of 19 features and 2310 instances. Its objective is to classify seven distinct scenarios.

    4.2 Experimental Settings

All results for each model were calculated using the same datasets and random seeds. The DNEF was set up as follows: (1) each hidden layer consisted of three nodes; (2) the thresholds $\gamma$, $\varepsilon$, $\mu$, and $\rho$ of DNEF were set at 0.15%, 0.9%, 150%, and 10%, respectively. Here, $\gamma$ represents the threshold for the base classifier error, $\varepsilon$ represents the threshold for accuracy, $\mu$ represents the threshold for instance size, and $\rho$ represents the percentage of the transfer weights. The base classifier in DNEF can be any classification model that supports instance weights. To ensure fairness in our evaluation, we employed the same decision tree as RF, AdaBoost, and Voting for the base classifier of DNEF. The parameters for SVM and Voting remained consistent across all datasets. Specifically, for hard Voting, we employed decision trees with the same number and depth as DNEF, while SVM used default parameters for all datasets.

    4.2.1 The Adult Dataset

We used a DNEF with 20 hidden layers whose base classifier is a depth-6 decision tree. We compared it with an MLP with structure input-30-20-output, with a sigmoid layer added at the end. Both AdaBoost and Random Forest adopted default parameters from the Sklearn library, employing 50 and 100 classifiers, respectively.

    4.2.2 The REJAFADA Dataset

We used a DNEF with 20 hidden layers whose base classifier is a depth-5 decision tree. In addition, we compared it with an MLP having two hidden layers, with 512 and 256 units. For AdaBoost and Random Forest, we employed 50 and 100 classifiers, respectively.

    4.2.3 The IMDB Dataset

We used a DNEF with 50 hidden layers whose base classifier is a depth-10 decision tree. We increased the depth of the decision tree to deal with high-dimensional data while improving DNEF's performance. We compared it with an MLP with structure input-1024-512-output. AdaBoost and Random Forest had 150 and 200 classifiers, respectively.

    4.2.4 The Segmentation Dataset

We used a DNEF with 20 hidden layers whose base classifier is a depth-5 decision tree. In comparison, we employed an MLP with a structure of input-256-128-output. We used 150 and 200 classifiers for AdaBoost and Random Forest, respectively.

    4.3 Results and Analysis

In this section, we analyze the performance of DNEF based on the experimental results; we then vary the number of base classifiers in the hidden layer to further analyze DNEF.

    4.3.1 Overall Performance

We first describe the overall performance of DNEF according to the figures.

Fig. 2 displays the performance of DNEF and the baseline models on each metric on the Adult dataset. As shown in Fig. 2, DNEF performed the best on the Adult dataset, with the highest scores in Accuracy, AUC, F1, and Recall, although with a slightly lower Precision.

Fig. 3 shows how DNEF and the baseline models performed on each metric on the REJAFADA dataset. DNEF achieved the best scores on the REJAFADA dataset, with its Accuracy, AUC, and F1 scores all exceeding 0.98.

Fig. 4 shows the scores of DNEF and the baseline models on the IMDB dataset. On this high-dimensional sparse dataset, all ensemble models with the decision tree as the base classifier performed worse than MLP and SVM. In this case, DNEF performed better than the three ensemble models RF, AdaBoost, and Voting.

Fig. 5 illustrates the scores of DNEF and the baseline models on the Segmentation dataset. DNEF performed well in multi-class classification, outperforming RF, AdaBoost, SVM, and MLP in terms of Accuracy, F1 score, Precision, and Recall. Next, we analyze in detail the specific performance of DNEF on each dataset.

Figure 2: DNEF performance on the Adult dataset

Figure 3: DNEF performance on the REJAFADA dataset

Figure 4: DNEF performance on the IMDB dataset

Figure 5: DNEF performance on the Segmentation dataset

    4.3.2 The Adult Dataset

Table 2 provides a detailed performance comparison of DNEF and the other models. DNEF achieved first place with an accuracy of 86.18%, slightly higher (by 0.09%) than second-place AdaBoost. Moreover, DNEF achieved the highest Recall among all models, 1.98% higher than RF. In terms of the F1 score, DNEF was 1.01% higher than second-place AdaBoost. It can be seen from the AUC that DNEF was least affected by the number of positive and negative instances, scoring 0.94% higher than AdaBoost. SVM and MLP were prone to overfitting on the Adult dataset with its small feature dimension; therefore, they were not as effective as DNEF. Voting (decision tree) had the worst ability to recognize negative instances, so its Recall score was significantly lower than the other models.

Table 2: Classification performance on the Adult dataset

On the Adult dataset, the weighting strategy made DNEF more accurate than RF, SVM, and MLP. DNEF achieved the highest Recall score, possibly due to the transfer strategy, which improves the recognition of positive instances but also favors predicting instances as the positive class, resulting in a decrease in Precision. DNEF not only surpassed AdaBoost in terms of accuracy but also led in F1 and AUC. We can conclude that the transfer strategy effectively enhanced DNEF's ability to learn from data, making it an excellent classification framework.

    4.3.3 The REJAFADA Dataset

Table 3 shows a detailed analysis of DNEF's performance on the REJAFADA dataset. Compared to the other baseline models, DNEF achieved first place (98.09%) in Accuracy, F1, and AUC. While all baselines achieved an accuracy of 98% or less, DNEF ranked first with 98.09%, improving by 0.35% over second-place RF and 0.65% over third-place SVM. Although SVM had the highest Recall, its Precision was 2.55% lower than DNEF's.

Table 3: Classification performance on the REJAFADA dataset

Since the baseline models had high accuracy rates of over 97%, the challenge of classification was concentrated in a small number of indistinguishable instances. DNEF effectively handled these tricky instances, benefiting from its transfer strategy. DNEF used the square root to scale down the weights of base classifiers with high accuracy, effectively mitigating the frequent overfitting problem. Compared with the ensemble learning models AdaBoost and Voting, DNEF outperformed them in terms of Accuracy, AUC, and F1. This superior performance was attributed to DNEF's effective weighting strategy, which ensures accurate classification, and its transfer strategy, which enhances the recognition ability for samples from various classes. Consequently, DNEF exhibited exceptional performance in F1 and AUC. SVM and MLP had a significant bias in their ability to recognize positive and negative instances, leading to lower F1 scores.

    4.3.4 The IMDB Dataset

In this subsection, we evaluate the performance of DNEF on the IMDB dataset based on Table 4. Regarding Accuracy, DNEF improved by 0.21% and 0.98% compared to RF and AdaBoost, respectively. The Precision of DNEF was slightly better than RF's, by 0.17%. Although the Recall score of DNEF was 1.12% lower than that of AdaBoost, DNEF scored higher than AdaBoost on all other metrics. In terms of the F1 metric, which combines Precision and Recall, DNEF exceeded all ensemble models. Additionally, DNEF achieved the highest AUC score among all ensemble models, surpassing RF by 0.21%.

Table 4: Classification performance on the IMDB dataset

The IMDB dataset is represented by high-dimensional, sparse tf-idf features. According to Table 4, DNEF was still better at processing tf-idf feature data than the ensemble learning models with decision trees, such as RF, AdaBoost, and Voting. The inferior performance of Voting suggests that the decision tree alone is inadequate for handling the IMDB dataset. Comparing the boosting ensemble strategy of AdaBoost with the bagging ensemble strategy of RF, we observe that DNEF surpasses these two representative ensemble models in terms of Accuracy, AUC, F1, and other metrics. SVM and MLP performed well on the IMDB dataset, which can be attributed to their ability to handle high-dimensional sparse data.

    4.3.5 The Segmentation Dataset

In this subsection, we present an analysis of DNEF's performance based on the results summarized in Table 5. Notably, DNEF demonstrated a significant advantage in multi-class classification. It achieved the highest Accuracy, F1, Precision, and Recall among the evaluated models. Specifically, in terms of Accuracy, DNEF surpassed second-place RF by a margin of 0.97%, reaching 97.32%. Moreover, DNEF exhibited exceptional performance in F1, outperforming RF by 0.97%. Regarding Precision and Recall, DNEF consistently delivered the best results. The AUC of DNEF was only 0.04% lower than that of RF, underscoring its competitive performance across all measured metrics.

Table 5: Classification performance on the Segmentation dataset

DNEF demonstrated strong performance in multi-class classification tasks, likely due to the effective combination of its weighting strategy and its transfer strategy. DNEF effectively addressed the complexity of multi-class classification through its weighting strategy. By adapting the classifier and instance weights for each class, DNEF further enhanced its accuracy. This tailored adjustment ensured that DNEF could effectively handle the intricacies associated with multi-class classification. Notably, DNEF maintained high classification accuracy while excelling in the F1 metric. This achievement was attributed to its transfer strategy, which intelligently guided indistinguishable instances from each class to the next hidden layer during operation. Consequently, DNEF maintained excellent Precision and Recall, further solidifying its suitability for multi-class classification tasks.

    4.3.6 Parameter Analysis

The number of nodes in the hidden layer is an important parameter of DNEF. This subsection analyzes the impact of the number of nodes in the hidden layer on the performance of DNEF. The number of nodes in each hidden layer was varied to observe the different effects during training. Fig. 6 displays the training loss of DNEF with three different numbers (1, 2, and 3) of nodes per hidden layer on the Adult dataset. As shown in Fig. 6, as the number of nodes in the hidden layer increased, the loss curve became smoother and the training loss decreased. We analyze the details of the three curves in Fig. 6 as follows:

Figure 6: The training loss of DNEF with three different numbers of nodes

The gray line represents only one classifier in each layer of DNEF. In this case, the transfer strategy did not take effect. The training loss of the gray line decreased from 4.92 to 4.71, dropping in an oscillating manner as the number of layers increased. Compared to the other lines, the gray line oscillated the most and had a higher loss for the same number of hidden layers. The orange line represents two nodes in each hidden layer of DNEF, in which case the transfer strategy took effect. The training loss dropped from 4.81 to 4.66. Compared to the gray line, the orange line dropped more smoothly, and the loss was smaller for the same number of hidden layers. The blue line represents three nodes in each layer of DNEF. The blue line dropped from 4.77 to 4.51, the smallest of all the lines. As shown in Fig. 6, its loss curve was smoother than the other lines, and its training loss was consistently smaller. This shows that the weighting strategy worked better with the transfer strategy as the number of nodes in the hidden layer increased, effectively reducing the training loss and making the training smoother.

    4.4 Experiment Conclusion

We compared DNEF with the baseline algorithms on four datasets. Our experimental results demonstrate that DNEF outperforms the traditional ensemble models. DNEF significantly improved Accuracy and F1 on the Adult, REJAFADA, and Segmentation datasets compared to the baseline models. On the high-dimensional IMDB dataset, DNEF outperformed traditional ensemble models in Accuracy, F1, AUC, and Precision. DNEF also showed excellent performance on the Segmentation dataset, again proving its strength in multi-class classification. At the same time, we can infer a limitation of DNEF: it does not match MLP and SVM on high-dimensional data when using the decision tree as the base classifier. This remains a challenge for DNEF at present. It can be concluded from the parameter analysis that increasing the number of nodes benefits the effective implementation of DNEF's transfer and weighting strategies, ultimately enhancing training effectiveness and stability.

The DNEF framework uses the weighting strategy between hidden layers and the transfer strategy within hidden layers, which is our main difference from other ensemble models. From the experimental results and parameter analysis, we conclude that it is the network structure of DNEF that contributes to its superior performance compared to other traditional ensemble models.

    5 Conclusion

In this paper, we propose DNEF, a new ensemble learning architecture. DNEF incorporates a deep network structure that iterates between hidden layers and trains classifiers in parallel within each hidden layer. The weighting strategy trains the classifier based on the instance weights generated in the previous layer and further adjusts the instance weights; the transfer strategy operates within the hidden layer, selecting instances with weights for each node in the layer and combining them with the training data for the next layer. Compared to popular ensemble learning approaches, DNEF accounts for the relationship between base classifiers using the weighting strategy and enhances model complexity with the transfer strategy. DNEF demonstrated promising results on four real datasets. Specifically, on the multi-class Segmentation dataset, DNEF achieved 0.9732 in both Accuracy and F1. These experimental findings validate that DNEF, as a novel deep ensemble architecture, outperforms traditional ensemble models. The primary contribution of this paper lies in exploring ensemble structures under the network structure and expanding the ideas for ensemble learning. In our future work, we will extend DNEF in two ways: (1) we will explore various base classifiers for high-dimensional datasets; (2) we will also consider applying DNEF to different data, such as images and sound signals, and to other learning tasks, such as semi-supervised learning and incremental learning.

Acknowledgement: We thank the School of Mathematics and Information of South China Agricultural University for supporting this study.

Funding Statement: This work is supported by the National Natural Science Foundation of China under Grant 62002122, the Guangzhou Municipal Science and Technology Bureau under Grant 202102080492, and the Key Scientific and Technological Research Project of the Department of Education of Guangdong Province under Grant 2019KTSCX014.

Author Contributions: Study conception and design: Ge Song; data processing: Changyu Liu, Zhuoyu Ou; analysis and interpretation of results: Ge Song, Yuqiao Deng, Siyu Yang; draft manuscript preparation: Siyu Yang, Ge Song. All authors reviewed the results and approved the final version of the manuscript.

Availability of Data and Materials: Four publicly available datasets were used for analyzing our model. They can be found at https://archive.ics.uci.edu and https://ai.stanford.edu/~amaas/data/sentiment.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
