
    BHGSO: Binary Hunger Games Search Optimization Algorithm for the Feature Selection Problem

    Computers, Materials & Continua, 2022, Issue 1

    R. Manjula Devi, M. Premkumar, Pradeep Jangir, B. Santhosh Kumar, Dalal Alrowaili and Kottakkaran Sooppy Nisar

    1Department of Computer Science and Engineering, Kongu Engineering College, Perundurai, 638060, Tamil Nadu, India

    2Department of Electrical and Electronics Engineering, Dayananda Sagar College of Engineering, Bengaluru, 560078, Karnataka, India

    3Rajasthan Rajya Vidyut Prasaran Nigam, Sikar, 332025, Rajasthan, India

    4Department of Computer Science and Engineering, Guru Nanak Institute of Technology, Hyderabad, 501506, Telangana, India

    5Mathematics Department, College of Science, Jouf University, Sakaka, P.O. Box 2014, Saudi Arabia

    6Department of Mathematics, College of Arts and Sciences, Prince Sattam bin Abdulaziz University, Wadi Aldawaser, 11991, Saudi Arabia

    Abstract: In machine learning and data mining, feature selection (FS) is a classical yet complicated optimization problem. Because the run time grows exponentially with the number of features, FS is treated as an NP-hard problem. The ongoing need for an efficient FS framework, together with the success of swarm intelligence in diverse optimization scenarios, motivated the development of a new FS solution. This paper presents two binary variants of the Hunger Games Search Optimization (HGSO) algorithm, based on V- and S-shaped transfer functions, within a wrapper FS model for choosing the best features from large datasets. The proposed technique transforms the continuous HGSO into binary variants using V- and S-shaped transfer functions (BHGSO-V and BHGSO-S). To validate the accuracy, 16 well-known UCI datasets are considered and compared against different state-of-the-art metaheuristic binary algorithms. The findings demonstrate that BHGSO-V achieves better performance than other state-of-the-art algorithms in terms of the number of selected features, classification accuracy, run time, and fitness values. The results demonstrate that the BHGSO-V algorithm can reduce dimensionality and choose the most helpful features for classification problems. The proposed BHGSO-V achieves 95% average classification accuracy on most of the datasets, with a run time of less than 5 s for low- and medium-dimensional datasets and less than 10 s for high-dimensional datasets.

    Keywords: Binary optimization; feature selection; machine learning; hunger games search optimization

    1 Introduction

    Due to the significant advancement of technology, including the Internet, in various areas, many databases have recently been developed, and their complexity and diversity have also grown. Nevertheless, high-dimensional databases have some drawbacks, including lengthy model-building periods, incomplete features, and deteriorated efficiency, making data analysis challenging [1,2]. Feature selection (FS) is a robust data pre-processing phase that can limit the feature count and dataset dimensions, improve model generalization, and reduce overfitting. The process of deciding which attributes to use in a classification problem is known as FS [3]. FS aims to select a set of attributes that improves the accuracy rate or reduces the size of the framework without substantially degrading the classifier's accuracy. For machine learning practitioners who must handle complicated data, FS is essential. It is frequently used in machine learning, data mining, and pattern recognition [3-5]. FS methods have previously been applied in video classification, image retrieval, gender classification, vehicle identification, and other applications [6,7]. By removing unwanted or noisy features, FS reduces the dimensions of a given dataset. This decreases the complexity and data-processing burden, lowers the computational complexity of the classification model, and saves resources while improving the algorithm's efficiency. The FS approach helps to reduce the dimensionality of real-world problems. Investigators have produced a series of FS methods to date. FS approaches may be classified as filter or wrapper approaches depending on whether an assessment tool is accessible [8,9]. Filter-based approaches pick the dataset's attributes first and then train the learner. The fundamental premise is to assign a weight to each feature, with the weight representing that dimension's value, and then rank the attributes based on the weights. Wrapper-based approaches explicitly use the classification model's performance as the feature subset's evaluation measure. The key concept is to approach subset choice as a search optimization problem: generating multiple subsets, evaluating them, and comparing them to other configurations. The wrapper outperforms the filter in terms of learning classification accuracy but at a high computational cost.

    Conventional optimization approaches are incapable of solving complex optimization tasks, and obtaining acceptable results with them is challenging. As a result, a more efficient class of approaches, known as metaheuristic algorithms, has been suggested and used by many researchers. Metaheuristic algorithms have several benefits, including their ease of use, independence from the problem, versatility, and gradient-free design [10]. Evolutionary algorithms, human-based and physics-based algorithms, and swarm intelligence (SI) approaches are the four types of metaheuristic algorithms. The genetic algorithm (GA) is an example of an evolutionary algorithm, in which three steps, namely crossover, selection, and mutation, are utilized to update an individual and attain a global solution. The differential evolution (DE) algorithm is also included in this category. The social behavior of animals inspired swarm intelligence, in which all personal knowledge is shared during the optimization procedure. Ant colony optimization, the Dragonfly Algorithm, the grey wolf optimizer (GWO), the Salp Swarm Algorithm (SSA), particle swarm optimization, the artificial bee colony, the flower pollination algorithm, the Marine Predator Algorithm (MPA), and the satin bowerbird optimizer are a few examples of SI algorithms. The sailfish optimizer, the emperor penguin optimizer, the whale optimization algorithm (WOA), Harris hawks optimization (HHO), and the Artificial Butterfly Optimization Algorithm (ABOA) are a few recent algorithms in the SI category. Various researchers have utilized SI algorithms due to their versatility and simplicity. Laws of physics inspire physics-based algorithms. For instance, the gravitational search algorithm, simulated annealing, the Sine-Cosine Algorithm (SCA), the Atom Search Optimization (ASO) algorithm, the Equilibrium Optimizer (EO), the multi-verse optimizer, and the Henry gas solubility algorithm (HGSA) belong to this category. Human behavior and interaction in society inspire human-based techniques, for instance, teaching-learning-based optimization, the cultural evolution algorithm, and the volleyball premier league algorithm [11-16]. The advantage of using such techniques to solve real-world problems is that they deliver the best outcomes in a short period, even for huge problem sets [17-19]. To address the FS problem, all the above-said algorithms and their binary variants have been used.

    The authors of [19] suggested a binary GWO (BGWO) algorithm hybridized with a two-stage mutation scheme for the FS problem, utilizing the sigmoid function to map the basic version to the binary version. The authors of [20] presented a binary SSA accompanied by crossover for the feature selection problem, using various transfer functions to convert the SSA into its binary version. To overcome the FS challenge, the authors of [21] proposed a binary variant of ABOA using two transfer functions: V-shaped and S-shaped. The authors of [22] presented a binary DA (BDA) for FS based on eight transfer functions. The benefit is that the likelihood of changing the location of an individual during the initial stages is high, making it easier to discover solutions different from the initial population. The authors of [23] presented an improved BDA version combined with a hyper-learning scheme, called HLBDA; the sigmoid function is utilized to convert the algorithm into the binary search space. The authors of [24] proposed a binary symbiotic organism search algorithm based on the wrapper technique that uses a time-varying mechanism for the FS process. To address the FS problem, the authors of [25] utilized HGSA, in which k-nearest neighbor (kNN) and Support Vector Machine (SVM) classifiers were used to validate the selected features. The authors of [26] presented a binary SCA (BSCA) for the FS problem, using eight transfer functions to convert it into its binary version. The authors of [27] presented a binary ASO (BASO) for the wrapper FS problem, again using eight transfer functions. The authors of [21] suggested a binary EO (BEO) and its improved versions for wrapper-based FS problems using the sigmoid function. The authors of [28] suggested a binary MPA (BMPA) and its improved versions for wrapper-based FS problems; the continuous version of MPA is converted into binary variants using the sigmoid and eight other transfer functions. Metaheuristic techniques such as simulated annealing, GA, DE, the ant lion optimization algorithm, the harmony search algorithm, and the particle swarm optimization algorithm are often used as search strategies for wrapper-based FS problems. Please see the literature [29] for more information on metaheuristics for the FS problem. The authors of [30,31] introduced a quadratic HHO and an enhanced WOA for high-dimensional feature selection problems. Can conventional systems help address the FS problem? As discussed earlier, metaheuristics have many advantages. The no-free-lunch (NFL) theorem is the answer to this question, stating that no single algorithm can address all classification problems [32]. For FS problems, one algorithm's output on a given database may be excellent, while another algorithm's success may be impaired.

    Therefore, in this paper, a binary variant of the recently proposed Hunger Games Search Optimization (HGSO) algorithm [33] is proposed to handle the feature selection problem. HGSO is based on the hunger-driven and social behaviors of animals. This dynamic, fitness-based search approach uses the basic principle of "hunger", one of the essential homeostatic incentives and explanations for the behaviors, actions, and decisions of all species, to make the optimization technique more intuitive and straightforward. The following are the reasons for using the HGSO technique to optimize the FS problem in this study. First, the preceding paragraphs demonstrate that metaheuristic procedures outperform other state-of-the-art techniques in solving these problems; as a result, we would like to put the latest HGSO algorithm to the test. Second, the HGSO method is a brand-new metaheuristic algorithm that has yet to be applied effectively to FS problems. Finally, an evaluation of the proposed algorithm against sophisticated, recent, and highly efficient algorithms indicates that the proposed HGSO algorithm attains the optimal or a suboptimal solution with typically higher classification efficiency for the problems studied (i.e., fewer iterations or less run time). This paper suggests two new discrete HGSO versions, called BHGSO-V and BHGSO-S, to make the FS problem easier to handle. The algorithm uses V- and S-shaped transfer functions to convert the continuous HGSO algorithm to a binary form. To validate the selected features, the kNN classifier is used in this paper. The following are the highlights of the paper.

    · A new binary variant of the HGSO algorithm is formulated using different transfer functions.

    · BHGSO-V and BHGSO-S algorithms are applied to low, medium, and high dimensional FS problems.

    · The performance of the BHGSO algorithm is compared with other state-of-the-art algorithms.

    · Statistical tests, such as Friedman's test and the Wilcoxon Signed-Rank test, have been conducted.

    The structure of the paper is organized as follows. Section 2 explains the basic concepts of the HGSO algorithm. Section 3 explains how the continuous HGSO algorithm is converted to a binary version using V- and S-shaped transfer functions. Section 4 presents the results and further discussion while validating the performance of the proposed algorithm on 16 UCI datasets. Section 5 concludes the paper.

    2 Hunger Games Search Optimization (HGSO) Algorithm

    The Hunger Games Search Optimization (HGSO) algorithm was introduced by Yang et al. [33] in 2021 for continuous optimization problems. The HGSO algorithm is inspired by the everyday actions of animals, such as hunger and the fear of being killed by predators. This section explains the mathematical modeling of the HGSO algorithm. The modeling is based on social choice and hunger-driven actions.

    2.1 Approach Food

    The approaching behavior of hunger is mathematically modeled in this subsection. The game rules are presented in Eq. (1), which describes the foraging hunger and supportive communication actions of the individuals. The mathematical expression given in Eq. (1) imitates the contraction mode.

    where F(i) denotes the cost function value of each individual, i ∈ 1, 2, ..., n, BF denotes the best cost function value obtained during the current iteration, and sech denotes the hyperbolic secant function, sech(x) = 2/(e^x + e^(-x)). The expression for R is given in Eq. (3).

    where Max_iter denotes the maximum number of iterations and rand denotes a random number between [0,1].
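    To make the approach-food phase concrete, the update rules of Eqs. (1)-(4) can be sketched in NumPy. This is an illustrative sketch following the formulation of the original HGS paper by Yang et al. [33]; the exact coefficient forms (the shrinking factor a, the range of R, and the default value of the control parameter l) are assumptions based on that paper, not this article's own code.

```python
import numpy as np

def approach_food(X, Xb, W1, W2, F, BF, t, max_iter, l=0.08):
    """One 'approach food' update (Eqs. 1-4), sketched after Yang et al. [33].

    X: (n, d) current positions; Xb: (d,) best position found so far;
    W1, W2: (n, d) hunger weights; F: (n,) fitness values; BF: best fitness;
    l: control parameter (0.08 is an assumed default from the HGS paper).
    """
    n, d = X.shape
    a = 2.0 * (1.0 - t / max_iter)              # Eq. (4): shrinking coefficient
    R = 2.0 * a * np.random.rand(n, d) - a      # Eq. (3): R drawn from [-a, a]
    E = 1.0 / np.cosh(np.abs(F - BF))           # Eq. (2): variation control, sech(|F(i)-BF|)
    X_new = np.empty_like(X)
    for i in range(n):
        r1, r2 = np.random.rand(2)
        if r1 < l:
            # random self-perturbation of the current position
            X_new[i] = X[i] * (1.0 + np.random.randn())
        elif r2 > E[i]:
            X_new[i] = W1[i] * Xb + R[i] * W2[i] * np.abs(Xb - X[i])
        else:
            X_new[i] = W1[i] * Xb - R[i] * W2[i] * np.abs(Xb - X[i])
    return X_new
```

The three branches correspond to the contraction-mode cases of Eq. (1): a random move, and two cooperative moves toward or around the best individual Xb.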

    2.2 Hunger Role

    The hunger behavior of all individuals during the search is mathematically modeled in this subsection. The expressions for the hunger weights W1 and W2 are presented in Eqs. (5) and (6).

    where N denotes the population size, hungry denotes the starvation of each individual, SHungry denotes the sum of the starving feelings of all individuals, i.e., sum(hungry), and r3, r4, and r5 denote random numbers between [0,1]. The starvation of each individual is mathematically modeled in Eq. (7).

    where AllFitness(i) denotes the cost function value of each individual in the current iteration. A new starvation H is added on top of the actual starvation. The expression for H is given in Eq. (8).

    where H is restricted to a lower bound LH, r denotes a random number between [0,1], WF and BF denote the worst and best fitness attained during the current iteration, respectively, F(i) denotes the fitness of each individual, r6 represents a random number in the range [0,1], and LB and UB denote the lower and upper limits of the dimensions, respectively. The pseudocode and flowchart of the HGSO algorithm are displayed in Algorithm 1 and Fig. 1, respectively.
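    A minimal sketch of the hunger update of Eqs. (7)-(8), assuming the formulation of the original HGS paper: the best individual resets its hunger to zero, while every other individual accumulates a starvation increment H floored at the lower bound LH. The exact scaling of the temporary hunger (by the fitness gap and the search range UB - LB) is an assumption based on that paper.

```python
import numpy as np

def update_hunger(hungry, fitness, BF, WF, LB, UB, LH=100.0):
    """Hunger update sketch (Eqs. 7-8); LH=100 is an assumed default."""
    new_hungry = hungry.copy()
    for i in range(len(fitness)):
        if fitness[i] == BF:
            new_hungry[i] = 0.0  # Eq. (7): the best individual is not hungry
        else:
            r, r6 = np.random.rand(2)
            # temporary hunger scaled by the fitness gap and the search range
            TH = (fitness[i] - BF) / (WF - BF + 1e-12) * r6 * 2.0 * (UB - LB)
            H = LH * (1.0 + r) if TH < LH else TH  # Eq. (8): floor at LH
            new_hungry[i] = hungry[i] + H
    return new_hungry
```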

    Algorithm 1: Pseudocode of the HGSO Algorithm
      Initialize the variables N, Max_iter, l, D, and SHungry
      Initialize the individuals' positions Xi (i = 1, 2, ..., N)
      While (t ≤ Max_iter)
        Find the cost function value of all individuals
        Update BF, WF, Xb, and BI
        Find hungry, W1, and W2 using Eqs. (7), (5), and (6), respectively
        For each individual
          Find E by Eq. (2)
          Update R and the position by Eqs. (3) and (1)
        End For
        t = t + 1
      End While
      Return BF and Xb

    Figure 1: Flowchart of the HGSO algorithm

    3 Proposed Binary Hunger Games Search Optimization (BHGSO) Algorithm

    The HGSO algorithm is a recently developed population-based algorithm that imitates the hunger-driven foraging behavior of animals. In terms of avoidance of local optima, exploitation, exploration, and convergence, the HGSO algorithm outperforms other population-based algorithms; its inventors have shown that HGSO performs better on benchmark functions. Due to the better balance between the exploration and exploitation phases of the HGSO algorithm, its convergence and solution diversity are better. These appealing properties motivated the researchers to use the HGSO algorithm in real-world applications, including wrapper-based FS problems.

    In the wrapper-based FS process, the classification model is used for training and validation at each phase, and a sophisticated optimization technique is then used to reduce the number of iterations. Besides, the search space is likely to be highly nonlinear, with numerous local optima. Generally, continuous optimization techniques search for feature combinations that optimize classification efficiency, with populations moving through a d-dimensional search space with positions in [0,1]. In contrast, binary versions are expected to perform well when used similarly, since the search space is restricted to two values (0, 1) in every dimension. Furthermore, binary operators are easier to understand than continuous operators [34]. The primary reason for creating a binary representation of the HGSO algorithm is that the solutions to FS problems are restricted to the binary values 1 and 0. For the FS problem, a new binary HGSO (BHGSO) algorithm is therefore suggested in this study. Each individual can change its location using either the local or the global search stage. To do so, a transfer function (TF) must be used to map the continuous hunger values to probabilities so that the position can be updated; in other terms, a TF describes the likelihood of changing an element of the location vector from 0 to 1 or from 1 to 0. Essentially, the binary search space is a hypercube, and the populations of the HGSO algorithm can only shift to the hypercube's corners by flipping different bits. Two TFs, the S- and V-shaped TFs, are used to create two binary variants, BHGSO-S and BHGSO-V, respectively.

    3.1 S-Shaped Binary Hunger Games Search Optimization (BHGSO-S) Algorithm

    As discussed, the new positions found through local or global search are continuous, and these continuous positions must be converted into binary. This transformation is accomplished by mapping the continuous position in each dimension through a sigmoidal (S-shaped) TF, which directs the individual toward a binary location [35]. As shown in Eq. (10) and Fig. 2a, the S-shaped curve is a typical TF.

    where F_k^i denotes the continuous value of the ith individual in the kth dimension at the current iteration t. The S-shaped TF's output is still continuous, so it must be thresholded to reach binary values. The S-shaped TF transforms an unbounded input to a bounded output in a stable way. It is worth noting that as the TF's input increases, the likelihood of setting the corresponding position bit increases. As is common with sigmoidal equations, a stochastic threshold is used to achieve the binary value, as shown in Eq. (11).
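    The two-step binarization of Eqs. (10)-(11), a sigmoid mapping followed by a stochastic threshold, can be sketched as:

```python
import numpy as np

def s_shaped_binarize(X):
    """S-shaped (sigmoid) transfer: map continuous positions to
    probabilities (Eq. 10), then threshold against rand() (Eq. 11)."""
    prob = 1.0 / (1.0 + np.exp(-X))                       # Eq. (10)
    return (np.random.rand(*X.shape) < prob).astype(int)  # Eq. (11)
```

Large positive continuous values map to probabilities near 1 and hence bits set to 1; large negative values map to 0.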

    Figure 2:Characteristic curves;(a) S-shaped TF,(b) V-shaped TF

    3.2 V-Shaped Binary Hunger Games Search Optimization (BHGSO-V) Algorithm

    Rather than an S-shaped TF, a V-shaped TF is also used in this paper, and Eqs. (12) and (13) are used to accomplish this process [36,37]. Fig. 2b shows the shape of the proposed TF used to force an individual toward a binary position.

    Eq. (12) can be rewritten as follows.

    The threshold rules can be mathematically denoted as in Eq. (14).

    Eq. (12) is used as the TF in this binary method to convert the position of an individual into the probability of adjusting the components of its location vector. As a result, the location vectors are modified using the rules of Eq. (14). The V-shaped TF has the advantage of not forcing an individual to take a fixed value of 0 or 1. In other words, it flips a bit to its complement only when the transfer value is high; otherwise, the individual remains in its current location.
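    A sketch of the V-shaped rule of Eqs. (12)-(14). The specific V-shaped function |tanh(x)| used below is an assumed common choice, since several V-shaped TFs appear in the literature [36,37]; the essential difference from the S-shaped rule is that a flip moves the bit to its complement rather than to a fixed 0/1 value.

```python
import numpy as np

def v_shaped_binarize(X, X_bin):
    """V-shaped transfer sketch: a high |X| value means a high probability
    of flipping the current bit to its complement; otherwise keep the bit."""
    prob = np.abs(np.tanh(X))                # assumed V-shaped TF (Eqs. 12-13)
    flip = np.random.rand(*X.shape) < prob   # Eq. (14): stochastic flip rule
    return np.where(flip, 1 - X_bin, X_bin)
```

With X near zero the flip probability vanishes, so the current binary position is retained, which is exactly the "remain in the current location" behavior described above.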

    3.3 Binary Hunger Games Search Optimization Algorithm for the FS Problem

    FS is a binary optimization process in which the populations can only choose between 0 and 1. Any solution is represented by a one-dimensional vector whose length is determined by the number of attributes (nf) in the dataset. Each cell may take one of two values, 0 or 1, where 1 specifies that the respective feature is chosen and 0 specifies that it is not.

    In general, the FS problem is a multi-objective optimization problem with two conflicting objectives: choosing the smallest nf while maintaining the highest classification accuracy. The proposed binary optimization approaches are used to solve this multi-objective FS problem. In an FS problem, the smaller the nf and the higher the classification accuracy, the better the solution. The fitness function evaluates each solution using the kNN classifier to measure the accuracy rate and the nf chosen. The fitness function in Eq. (15) is used to evaluate the strategies, with the goal of finding a balance between the nf and classification accuracy.

    where α ∈ [0,1], β = (1 - α), γr(D) denotes the classification error rate of the kNN classifier, |R| denotes the cardinality of the selected feature subset, and |N| denotes the total number of attributes in the dataset. α and β are two weights reflecting classification quality and subset length, respectively. The overall flowchart of the BHGSO algorithm for FS problems is shown in Fig. 3.
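    Eq. (15) combines the two objectives into a single scalar to be minimized. A sketch, with alpha = 0.99 as an assumed (commonly used) weight, since the article does not state the value at this point:

```python
def fs_fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Wrapper FS fitness, Eq. (15):
    fitness = alpha * gamma_r(D) + beta * |R| / |N|, beta = 1 - alpha.
    Smaller is better: low error and few selected features."""
    beta = 1.0 - alpha
    return alpha * error_rate + beta * (n_selected / n_total)
```

A perfect classifier using no features scores 0, and for a fixed error rate the fitness grows with the size of the selected subset, which is the intended trade-off.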

    Figure 3: Overall flowchart of the BHGSO algorithm for the FS problems

    4 Results and Discussion

    4.1 Dataset Explanation

    Sixteen benchmark datasets from the UCI repository were selected for experimentation to verify the output of the proposed binary approaches. Tab. 1 shows the selected datasets, including the number of instances and features in each dataset. The motive behind choosing such datasets is that they include a range of instances and features that reflect the various issues on which the proposed binary strategies are evaluated. A collection of high-dimensional datasets is also chosen to evaluate the efficiency of the suggested technique in high-dimensional search spaces. Each dataset is split into cross-validation groups for assessment purposes. In K-fold cross-validation, the dataset is split into several folds, with K-1 folds used for training and the remaining fold used for testing. This operation is repeated M times; as a result, for each dataset, each algorithm is evaluated K×M times. The training portion is used to train the classifier during the optimization, while the testing portion is used to test the classifier's output while it is being optimized. The validation portion is used to evaluate the attribute set chosen by the trained classifier.
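    The wrapper evaluation described above can be sketched with a minimal kNN classifier (k = 5, as used in this paper) and K-fold cross-validation over one candidate feature subset. This is an illustrative re-implementation for clarity, not the authors' MATLAB code:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Minimal kNN with Euclidean distance and majority vote."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]          # k nearest training points
    votes = y_train[idx]
    return np.array([np.bincount(v).argmax() for v in votes])

def kfold_accuracy(X, y, mask, K=5):
    """Evaluate one binary feature mask by K-fold cross-validation."""
    X = X[:, mask.astype(bool)]                  # keep only selected features
    n = len(y)
    folds = np.array_split(np.random.permutation(n), K)
    accs = []
    for f in folds:
        train = np.setdiff1d(np.arange(n), f)
        pred = knn_predict(X[train], y[train], X[f])
        accs.append(np.mean(pred == y[f]))
    return float(np.mean(accs))
```

In the full wrapper loop, this accuracy (or its complementary error rate) feeds the fitness of Eq. (15) for each candidate mask produced by BHGSO.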

    Table 1: Datasets selected for this study

    Based on the kNN classifier, a wrapper-based method for FS has been used in this paper, with the common choice k = 5 being used on all datasets. Each individual represents a single set of features during the training phase. The training fold is used to measure the efficiency of the kNN classifier on the validation fold during the optimization phase to direct the optimal feature-subset selection criteria. In contrast, the test fold is kept hidden from the process and is only used for the final assessment. The suggested FS approaches are compared to several state-of-the-art approaches, including BEO, BMPA, BASO, BGWO, HLBDA, and BSCA. Tab. 2 details the parameter settings of all algorithms. The proposed and other selected algorithms are implemented in MATLAB, running on an Intel Core i5-4210U CPU @ 2.40 GHz with the Windows 8 operating system. All algorithms run 30 times on each dataset for fair comparison.

    4.2 Evaluation Criteria

    In each run of all algorithms, the following criteria are applied to the datasets.

    Classification Accuracy: It is a metric that defines how accurate a classification model is for a given set of features, i.e., the proportion of instances accurately categorized, and it is measured as follows:

    Table 2: Parameter settings of all algorithms

    where N signifies the number of test set points, M signifies the number of runs, Ci denotes the output label for data point i, match denotes the comparator that returns 0 when two labels are not identical and 1 when they are the same, and Li denotes the reference label for data point i.

    Average Selected Features: It represents the average of the nf over M runs and is defined as follows.

    where size(g*) denotes the number of attributes chosen in the testing dataset.

    Statistical Mean: It shows the average of the solutions obtained by running each method M times and can be expressed as follows.

    where g*_i denotes the optimal fitness in the ith run.

    Statistical Standard Deviation (STD): It describes the variability of the optimal fitness produced by running an algorithm M times and can be expressed in the following way.

    Statistical Best: It describes the best fitness attained by running an algorithm over M runs.

    Statistical Worst: It describes the worst fitness attained by running an algorithm over M runs.

    Average Run Time (RT): It describes each algorithm's real RT in seconds over the diverse runs, and it is expressed as follows.
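    The per-run statistics listed above can be computed in one pass over the M recorded runs; since the FS fitness of Eq. (15) is minimized, the statistical best is the minimum:

```python
import numpy as np

def run_statistics(fitness_runs, feature_counts):
    """Statistics over M independent runs: mean, STD, best, worst fitness,
    and the average number of selected features."""
    f = np.asarray(fitness_runs, dtype=float)
    return {
        "mean": f.mean(),
        "std": f.std(),                          # population STD over the M runs
        "best": f.min(),                         # fitness is minimized
        "worst": f.max(),
        "avg_features": float(np.mean(feature_counts)),
    }
```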

    Wilcoxon Signed-Rank Test (WSRT) and Friedman's Rank Test (FRT): The WSRT and FRT statistical tests are utilized to see whether the solutions of the suggested algorithm are statistically different from those of other techniques. In the WSRT, all values are first ranked as a single group, and the ranks of each group are then summed. The null hypothesis states that the two samples come from the same distribution and that any differences in the rank sums are due to sampling error. This statistical test generates a p-value used to compare the significance levels of two techniques. Similarly, the FRT assigns ranks based on the best solutions obtained over the M runs of all algorithms.
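    The ranking step of the FRT can be sketched as follows. Ties are ignored for simplicity, and the full Friedman test additionally computes a chi-square statistic and p-value, omitted here; this fragment only illustrates how mean ranks like those in Tab. 10 arise:

```python
import numpy as np

def friedman_ranks(scores):
    """Friedman mean ranks: scores is (datasets x algorithms), lower = better
    (e.g., best fitness per dataset). Rank 1 goes to the best algorithm;
    the mean rank of each algorithm is taken across datasets."""
    ranks = np.argsort(np.argsort(scores, axis=1), axis=1) + 1
    return ranks.mean(axis=0)
```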

    4.3 Simulation Results

    The convergence curves of the proposed BHGSO-V and BHGSO-S algorithms and of the other selected algorithms on the sixteen datasets are shown in Fig. 4. The proposed BHGSO-V and BHGSO-S algorithms typically deliver excellent convergence behavior, outperforming other approaches in the search for optimal feature subsets. Of the two proposed algorithms, BHGSO-V converges faster to the best optimum on all high-dimensional datasets (Colon, Leukemia, TOX_171, COIL20, Lung, and ORL). This outcome strongly suggests the proposed BHGSO-V's dominance on high-dimensional FS problems. One reason why BHGSO shows high convergence speed is that the hunger-driven weights keep the search intensified around the best solutions found.

    The best fitness, mean fitness, worst fitness, and standard deviation (STD) of fitness obtained by all algorithms are presented in Tabs. 3-5. On all 16 datasets, BHGSO-V achieved the best fitness values, as shown in Tab. 3, whereas BHGSO-S obtained the best fitness only on five of the 16 datasets. After BHGSO-V, the algorithms BMPA, BEO, HLBDA, BGWO, BHGSO-S, BASO, and BSCA give the next-best fitness values. Of all algorithms, the proposed BHGSO-V has consistently shown a high level of performance when dealing with FS tasks. According to Tab. 4, in most cases (11 datasets), the proposed BHGSO-V algorithm attained the best mean fitness value, followed by BEO (nine datasets), BMPA (six datasets), BHGSO-S (five datasets), and BGWO (five datasets). As a result, the BHGSO-V algorithm can frequently find the optimal feature subset, producing acceptable outcomes. The V-shaped transfer function is mainly responsible for BHGSO-V's excellent search capability in addressing the FS problem. The STD of the objective function value obtained by the BHGSO-V algorithm is lower than that of all selected algorithms, followed by BMPA, BGWO, BEO, BHGSO-S, BASO, and BSCA, as shown in Tab. 5. In comparison to all competitors, the proposed BHGSO-V algorithm provides highly reliable performance. Boldface in all tables indicates the best result among all algorithms.

    Figure 4: Convergence curves of all algorithms on all selected datasets

    The average accuracy and the STD of the classification accuracy values obtained by all algorithms are listed in Tabs. 6 and 7, respectively. As seen in Tab. 6, the average accuracy obtained by BHGSO-V is the best (on 15 datasets), followed by BMPA, BEO, BHGSO-S, BGWO, HLBDA, BASO, and BSCA. As seen in Tab. 7, the STD of the classification accuracy obtained by the proposed BHGSO-V algorithm is the best (on 12 datasets), which means the reliability of the algorithm is better than that of the other techniques. As shown, the BHGSO-V algorithm outperformed other approaches in obtaining the best feature subset on most datasets. The boxplot analysis of the eight algorithms is shown in Fig. 5. In the present research, the proposed BHGSO algorithm gave the best median and mean values, as shown in Fig. 5. The outcomes show that the BHGSO algorithm is effective in maintaining better classification accuracy.

    Table 3: Best fitness values of all algorithms

    Table 4: Mean fitness values of all algorithms

    Table 5: STD of fitness values of all algorithms

    Table 6: Average classification accuracy of all algorithms

    The mean and STD of the number of features selected from the large datasets are shown in Tabs. 8 and 9, respectively. Across the sixteen datasets, the proposed BHGSO-V had the smallest feature size (12 datasets), followed by BMPA (nine datasets). Unlike BSCA, BASO, BGWO, HLBDA, BEO, and BMPA, the BHGSO algorithm can typically find a small subset of best features that better represents the target definition. The BHGSO algorithm can avoid local optima and efficiently reach the best FS solution. The low STD values obtained by BHGSO prove the reliability of the algorithm.

    Table 7: STD of classification accuracy of all algorithms

    A few statistical techniques are used to assess the proposed BHGSO algorithm's efficacy. Several statistical non-parametric tests are discussed in the literature. The statistical analysis in this study is separated into two parts. First, the Friedman rank test (FRT) is utilized to assess all algorithms' accuracy. The authors discovered that there is a substantial difference among the algorithms in this paper. The FRT of all algorithms is shown in Tab. 10. Based on the scores obtained, a ranking is assigned to all algorithms; the proposed BHGSO algorithm ranks first among all selected algorithms on all datasets. For pairwise comparisons, the Wilcoxon signed-rank test (WSRT) is used. If the p-value is larger than 0.05, the results of the two techniques are considered identical; otherwise, the two methods are significantly different. The WSRT of the BHGSO algorithm against all other methods is shown in Tab. 11. According to the findings, the suggested BHGSO's classification results were significantly better than those of the other competitors. Overall, the proposed BHGSO algorithm provides the highest classification accuracy while also reducing dimensionality. Another performance metric, RT, is illustrated in Fig. 6, from which it is observed that the RT of both versions of the BHGSO algorithm is significantly lower on more than 10 datasets.

    4.4 Discussions

    The proposed BHGSO algorithm was determined to be the best FS method based on the findings. The BHGSO-V algorithm enhanced global optimum detection on the complex databases in terms of convergence and minimal feature selection. The proposed BHGSO algorithms can sustain a decent convergence pace throughout the iterations. The results in Tabs. 3-5 demonstrated the superiority of the BHGSO algorithms in terms of the best, mean, and STD of the fitness values. The average accuracy and STD of the classification accuracy shown in Tabs. 6 and 7 prove the performance of the algorithm in classifying the datasets. The selected features shown in Tabs. 8 and 9 show the superiority of the BHGSO algorithms in selecting the optimal features. Furthermore, BHGSO performed more consistently than the traditional binary algorithms, as evidenced by its lower STD values. According to the experimental results, the suggested BHGSO algorithm can often outperform BEO, BMPA, BGWO, HLBDA, BASO, and BSCA in classification. Clearly, the features selected by BHGSO typically provide high-quality data, which helps to improve prediction ability. BHGSO effectively selected appropriate attributes and omitted most of the obsolete ones in high-dimensional datasets such as Colon, Leukemia, TOX_171, COIL20, Lung, and ORL. Compared to all selected algorithms, the suggested BHGSO is more capable of choosing significant features.

    Figure 5:Boxplot analysis of all algorithms on all selected datasets

    Table 8:Mean selected feature subsets of all algorithms

    Table 9:STD of the selected feature subsets of all algorithms

    Table 10:FRT of all algorithms on all datasets

    Table 11:WSRT of BHGSO-V against other selected algorithms on all datasets

    Figure 6:RT of all algorithms on all selected datasets

    5 Conclusion

    In this study,binary versions of the HGSO algorithm are introduced and used to address FS problems in wrapper form.Using either S-shaped or V-shaped transfer functions (TFs),the continuous variant of HGSO is converted to a binary variant.The proposed techniques can be used for FS in machine learning to evaluate the searching abilities of various algorithms.The FS problem is expressed as a multi-objective problem with an objective function reflecting both dimensionality reduction and classification accuracy.To evaluate the output,16 datasets from the UCI repository were selected.The suggested BHGSO variants were applied to the FS problems,and the experimental outcomes were compared with advanced FS methods such as BEO,BMPA,HLBDA,BGWO,BSCA,and BASO using a collection of evaluation criteria covering various aspects of the results.On most datasets,the experimental findings showed that the proposed BHGSO led to better outcomes than the other strategies.Furthermore,the findings,such as classification accuracy (>95%) and run time (<5 s for low- and medium-dimensional problems and <10 s for high-dimensional problems),demonstrate that using BHGSO with a V-shaped TF can significantly boost the performance of HGSO in terms of the number of features selected and classification accuracy.The experimental results reveal that BHGSO-V searches the feature set more efficiently and converges to the best solution faster than the other optimizers.The continuous HGSO was also successfully transformed into binary variants that can address several discrete problems,including the task scheduling,traveling salesman,and knapsack problems.
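    The continuous-to-binary conversion summarized above can be illustrated with a minimal sketch.The textbook forms S(x) = 1/(1 + e^{-x}) and V(x) = |tanh(x)| are assumed here;the paper's exact TF definitions and update rules appear in the earlier methodology sections.

```python
# Minimal sketch of S- and V-shaped transfer-function binarization
# (assumed common forms, not the paper's exact update equations).
import math
import random

def s_shaped(x):
    # S-shaped TF: sigmoid maps a continuous position to (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def v_shaped(x):
    # V-shaped TF: |tanh(x)| grows with the magnitude of the position.
    return abs(math.tanh(x))

def binarize(position, tf, rng=random.random):
    # A feature is selected (bit = 1) when the transfer-function value
    # exceeds a uniform random threshold, as in typical binary metaheuristics.
    return [1 if tf(x) > rng() else 0 for x in position]

# One continuous HGSO position vector becomes a feature-selection mask:
continuous_position = [2.1, -0.3, 0.0, -1.7]
mask = binarize(continuous_position, v_shaped)
selected = [i for i, bit in enumerate(mask) if bit == 1]  # chosen feature indices
```

With a fixed threshold of 0.5 instead of a random draw, the position above would yield the mask [1, 0, 0, 1]: the V-shaped TF keeps dimensions whose positions have large magnitude and discards those near zero.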

    Funding Statement: The authors received no specific funding for this study.

    Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
