
    A Hybrid Feature Selection Framework for Predicting Students Performance

    2022-11-09 08:17:14
    Computers, Materials & Continua, 2022, Issue 1

    Maryam Zaffar, Manzoor Ahmed Hashmani, Raja Habib, K. S. Quraishi, Muhammad Irfan, Samar Alqhtani and Mohammed Hamdi

    1 High Performance Cloud Computing (HPC3), Department of Computer and Information Sciences, Universiti Teknologi PETRONAS, 32610 Seri Iskandar, Malaysia

    2 Department of Computer Science and Information Technology, University of Lahore, Pakistan

    3 Department of Process Engineering, Pakistan Institute of Engineering & Applied Sciences, Nilore, Islamabad, Pakistan

    4 Electrical Engineering Department, College of Engineering, Najran University, Najran 61441, Saudi Arabia

    5 College of Computer Science and Information Systems, Najran University, Najran 61441, Saudi Arabia

    Abstract: Student performance prediction helps educational stakeholders take proactive decisions and make interventions to improve the quality of education and meet the dynamic needs of society. The selection of features for predicting students' performance not only plays a significant role in increasing prediction accuracy but also helps in building strategic plans for improving students' academic performance. Different feature selection algorithms exist for predicting student performance; however, the studies reported in the literature show that existing algorithms have different pros and cons in selecting optimal features. In this paper, a hybrid feature selection framework (using feature fusion) is designed to identify the features that are significant and those that are associated with the target class, in order to predict the performance of students. The main goal of the proposed hybrid feature selection is not only to improve prediction accuracy but also to identify optimal features for building productive strategies for the improvement of students' academic performance. The key difference between the proposed framework and the existing hybrid feature selection framework is a two-level feature fusion technique that uses cosine-based fusion; according to results reported in the existing literature, cosine similarity is considered the best among existing similarity measures. The proposed hybrid feature selection is validated on four benchmark datasets with variations in the number of features and the number of instances. The results confirm that the proposed framework performs better than the existing hybrid feature selection framework and existing feature selection algorithms in terms of accuracy, f-measure, recall, and precision. The results reported in this paper show that the proposed approach achieves more than 90% accuracy on the benchmark datasets, which is better than the results of the existing approach.

    Keywords: Educational data mining; feature selection; hybrid feature selection

    1 Introduction

    Education is one of the main pillars of society: it shapes the character and intelligence of students. The current education system may not be suitable for the new and dynamic needs of society, and one major aspect of the new paradigm of education is to predict student performance beforehand. As students are the main stakeholders of educational systems, academic organizations can meet the dynamic needs of society by analyzing student data and developing predictions from it. Moreover, the results of such predictions can be helpful for making strategies to improve the quality of education, and better-quality education supports the development of skillful and capable students. This draws attention to the analysis of academic data.

    Student performance prediction models help in analyzing student data with the help of different data mining techniques, and many such models have been proposed. They have received a significant amount of attention from both the research community and the educational sector, and they tackle the prediction of student grades [1], GPA (Grade Point Average) [2], CGPA [3], and Pass/Fail outcomes in a course [4]. The goal of students' performance prediction models in EDM (Educational Data Mining) is not only to achieve high prediction accuracy but also to help educational stakeholders predict the performance of students. Students are the main assets of any community, and the main aim of any academic organization is to provide quality education to its students.

    There are two main methods of developing student performance prediction models: supervised and unsupervised. Classification is a type of supervised learning; according to [5], around 71.4% of research articles on students' performance prediction models use a classification method, making it the top method for performance prediction models [6]. In the classification method, the target variable to be predicted is clearly defined, whether it is grades, GPA, CGPA, or students' PASS/FAIL status. This motivated us to focus on a students' performance prediction model based on the classification method.

    Feature selection can play a prominent role in enhancing the accuracy of a prediction model. In a student prediction model, the selected features not only increase prediction accuracy but also provide the basis for strategic plans in the educational environment. According to [7], the information gain attribute evaluator is the best feature selection technique to improve the effectiveness of a student prediction model, whereas [8] claims the CFS subset evaluator is the best feature selection method for predicting students' final semester examination performance. According to [9], there is no single feature selection method that is accurate for all datasets, even within a common domain. There is therefore a need to focus on feature selection algorithms in the area of predicting student performance. Besides filter and wrapper methods, the third main type of feature selection is hybrid feature selection, which combines the advantages of filter and wrapper feature selection. Unfortunately, there is only a single hybrid feature selection framework for EDM [10]. The importance of feature selection methods in predicting students' performance motivated us to develop a feature selection framework for students' performance prediction with better prediction accuracy. Furthermore, the design of the existing hybrid feature selection framework also motivated us to focus on hybridization of feature selection algorithms to build a robust feature selection framework for student performance prediction.

    Contributions: The following are the contributions of this research in the domain of Educational Data Mining.

    (a) First, different benchmark datasets have been used to predict students' performance using feature selection.

    (b) Second, the importance of hybrid feature selection has been explored by comparing the results of the hybrid feature selection algorithm with filter and wrapper methods on various students' benchmark datasets.

    (c) Third, the gap that limited work has been done on students' performance prediction using hybrid feature selection is addressed.

    (d) Fourth, a novel hybrid feature selection framework is proposed to predict the performance of students, with better results than the existing hybrid feature selection method [10].

    A lot of work has been done on the development of students' performance prediction models, but the study of students' prediction models is still inadequate [11,12], especially in terms of prediction accuracy. This motivated us to develop a feature selection framework that can support a student performance prediction model and thereby help educational stakeholders. This will not only help academic organizations to build strategic plans accordingly but, with the help of the proposed hybrid feature selection framework, may also move the education system toward fulfilling the needs of the current and future society.

    2 Literature Review

    Improving the quality of education is one of the challenges for educational institutions. Improvement in the quality of education is required not only to assemble a higher level of knowledge but also to provide effective educational facilities that help students achieve their academic objectives without any problem [13-15]. Identification of the factors affecting students' performance is very important for improving the quality of education [16]. Student performance prediction models help educational institutions increase the quality of education by analyzing student data to make academic strategic plans for improving students' academic performance [17]. However, the study of student performance prediction is still insufficient [11]. The performance of a student prediction model mainly depends on the features selected from the dataset under consideration [18]. Feature selection helps in identifying suitable features from a dataset and is hence very important for student performance prediction models [19-23].

    The main focus of existing feature selection methods in EDM is to improve the prediction accuracy of the student performance prediction model, and hence they focus only on a feature's association with the target class. There are mainly two types of feature selection algorithms: filter and wrapper. Existing student performance prediction models [24-28] mainly use filter feature selection algorithms, which have the issue of ignoring dependencies and associative features (the interaction of features with the classifier) [29]. The emphasis of existing research on student performance prediction using feature selection is on reducing the number of features to improve the prediction accuracy of the model. Filter and wrapper feature selection both have different pros and cons, and hybrid feature selection takes advantage of both approaches [30]. The hybrid feature selection IFSFS [30], a hybridization of filter and wrapper feature selection algorithms, was proposed to diagnose erythemato-squamous diseases. Hybridization of SU (a filter feature selection) with a backward search strategy as a wrapper has various applications, including hypertension diagnosis [31,32], prediction of the type of cancer in a cancer patient [33,34], bioinformatics [35], credit scoring [36,37], and other domains [38].

    The existing hybrid feature selection models in different domains of research try to retrieve optimal features to obtain high prediction accuracy, but they have a foremost limitation in the flow of feature identification: features discarded in such hybrid feature selection methods are never re-evaluated at later levels. To the best of our knowledge, only one hybrid feature selection framework exists in EDM to predict the performance of students [39]. This existing hybrid feature selection is the combination of FCBF (filter feature selection) and SFS (wrapper feature selection), but it has the limitations of ignoring feature dependencies and highly associated features in the first phase of hybridization, and of a problematic hybridization flow, since a feature once removed can never be evaluated further.

    The identification of features from student performance prediction that help educational stakeholders is still a problem [40], because existing feature selection algorithms fall short in the optimal identification of features. The majority of approaches in student performance prediction are based on filter feature selection, hence the chance of ignoring features uniquely associated with the target class is high. The importance of hybridization, in terms of utilizing the advantages of both filter and wrapper feature selection, motivates building a hybrid feature selection framework to obtain optimal features. The features selected for predicting students' performance play a vital role in building strategic plans for improving the quality of education, which in return can result in positive changes in students' performance. Therefore, the features identified from educational datasets must not only be associated with the target class but must also be significant. The importance of feature significance and association with the target emphasizes the integration of both types of features from a student dataset. It is necessary to remove redundant features from a dataset while keeping the associated and significant features in focus. Moreover, some features may be both significant and associated, and such features must not be ignored during feature selection. Ignoring an optimal feature may lead to non-productive strategic plans for improving the quality of education.

    3 Methodology

    Fig. 1 describes the main process of the proposed optimized feature selection method for predicting the performance of students. The main phases of the proposed method are the identification of significant features and the identification of features highly associated with the target class. The significant features and highly associated features are fused into a new hybrid feature vector using an early-level feature fusion technique. A cosine-based feature selection equation is formulated to calculate the weights of the significant features and the highly associated features. The proposed method introduces the concepts of significant features, associated features, and hybrid features, where hybrid features are defined as features that are not only significant but also associated with the target class. To obtain the optimized features, the main steps of the hybrid feature selection framework using the feature-level fusion strategy are listed below along with a brief description.

    Figure 1:Hybrid feature selection framework with feature fusion

    3.1 Identification of Significant Features

    A filter feature selection algorithm is applied in this step; its details are explained below. Chi-square feature selection is used to statistically test the independence of a feature from the class label, and it has been used in different prediction models [41,42] to predict students' performance. In the proposed approach, the chi-square feature selection algorithm is adapted to compute the test of independence between the feature sfvi and the class scj.

    If X2(sfvi, scj) = 0, then the feature sfvi and the class scj are independent, which means that the feature sfvi does not contain any category information. Larger values of X2(sfvi, scj) indicate that the feature sfvi carries more category information. The chi-square statistic, in its standard form, is presented through Eq. (2):

    X2(sfvi, scj) = N (rij qij − pij cij)^2 / [(rij + pij)(cij + qij)(rij + cij)(pij + qij)]     (2)

    Here N is the total number of instances (students); rij is the frequency with which the feature sfvi and the category scj occur together; pij is the frequency with which the feature sfvi occurs but does not belong to category scj; cij is the frequency with which the category scj occurs without the feature sfvi; and qij is the number of times neither scj nor sfvi occurs. The feature vector containing the significant features, Edf, is then presented through Eq. (3).

    Edf contains all features sfvi having a value of X2(sfvi, scj) greater than zero, and Eq. (4) writes this feature vector explicitly as Edf = {fd1, fd2, ..., fdn}.
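    As an illustration of this step, the chi-square scoring and the construction of Edf can be sketched in Python as follows. This is a minimal sketch assuming binary (present/absent) features and a single target class; the function names (chi_square_score, significant_features) and the greater-than-zero threshold follow the description above and are not the authors' original code.

        import numpy as np

        def chi_square_score(feature_present, class_labels, target_class):
            # Chi-square test of independence between a binary feature and one class.
            in_class = (class_labels == target_class)
            N = len(class_labels)
            r = np.sum(feature_present & in_class)     # feature and class co-occur (rij)
            p = np.sum(feature_present & ~in_class)    # feature occurs, class does not (pij)
            c = np.sum(~feature_present & in_class)    # class occurs, feature absent (cij)
            q = np.sum(~feature_present & ~in_class)   # neither occurs (qij)
            num = N * (r * q - p * c) ** 2
            den = (r + p) * (c + q) * (r + c) * (p + q)
            return float(num) / den if den else 0.0

        def significant_features(X_binary, y, target_class):
            # Edf keeps every feature whose chi-square score is greater than zero (Eq. (3)).
            return [j for j in range(X_binary.shape[1])
                    if chi_square_score(X_binary[:, j].astype(bool), y, target_class) > 0]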

    3.2 Identification of Associated Features (Wrapper)

    The second step is the identification of features associated with the target class. This step identifies not only the features with a high association with the target class but also the dependencies between features. This is important not only for the students but also for the teachers, as it may guide teachers in improving their capabilities in order to increase the quality of education [43]. To identify the associated features, SFS wrapper feature selection is computed. SFS (sequential forward search) is a heuristic search algorithm that starts with an empty set [44]. Each of the features in the feature matrix SDm is evaluated through SFS feature selection, wrapped by the SVM classification algorithm. Each candidate feature is evaluated with 10-fold cross-validation, and the average accuracy over the 10 folds determines whether the evaluated feature should be added to the feature association vector. The features selected by SFS in each round are evaluated through the wrapped SVM classifier, and the features with high prediction accuracy in each round are selected. In order to avoid overfitting, the data is divided into 10 equal folds by 10-fold cross-validation; a feature with high accuracy across the 10 folds is considered highly associated with the target class. In each round, the selected features are added to the Eaf feature vector. The feature vector Eaf, containing the features associated with the target class, is represented through Eq. (5) as Eaf = {fa1, fa2, fa3, ..., fan}.
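    A minimal sketch of this wrapper step, assuming scikit-learn is available, is shown below; the stopping rule (stop when no remaining candidate improves the cross-validated accuracy) is one reasonable reading of the description above rather than the authors' exact criterion.

        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        def sequential_forward_selection(X, y, cv=10):
            # Greedy SFS wrapped around a linear SVM, scored by 10-fold CV accuracy.
            selected, remaining = [], list(range(X.shape[1]))
            best_overall = 0.0
            while remaining:
                scores = []
                for j in remaining:
                    acc = cross_val_score(SVC(kernel="linear"), X[:, selected + [j]], y,
                                          cv=cv, scoring="accuracy").mean()
                    scores.append((acc, j))
                best_acc, best_j = max(scores)
                if best_acc <= best_overall:   # no candidate improves accuracy -> stop
                    break
                best_overall = best_acc
                selected.append(best_j)
                remaining.remove(best_j)
            return selected   # indices forming the associated feature vector Eaf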

    3.3 Fusion of Significant and Associated Features Using Early-Level Feature Fusion Technique

    The significant and associated features are fused using the early-level feature fusion strategy. Academic decisions based on these features can play a vital role in improving the quality of education. The fusion of features is performed at two levels:

    i. Level 1: Identification of projected features using cosine weighting.

    ii. Level 2: Identification of highly significant and highly associated features.

    Fusion is the integration of different types of features in the process of feature selection [45]. There are different types of fusion: data fusion, decision fusion, and feature-level fusion. Since the main task of the proposed approach is related to features, feature fusion is computed in the proposed approach; feature fusion is also computed in different domains due to its simplicity. Feature fusion is a technique in which different feature sets are fused into a single feature set/representation. The main advantage of feature fusion is that the new fused feature set not only keeps the information about the features but also eliminates redundant information to a certain degree [46].

    The selection of feature-level fusion helps in computing the hybrid feature selection in two main ways.

    a) The main intention of the proposed work is to develop a feature selection method that can identify the most dominant factors affecting the performance of students, and feature-level fusion has the ability to derive the most important features from the feature sets involved in the fusion [45,47]. Taking this advantage into account, the proposed approach adopts feature fusion.

    b) Feature fusion can eliminate redundant features [47]. As redundant features might affect the prediction accuracy of student performance models, this can help elevate the prediction accuracy of the hybrid feature selection framework for student performance prediction.

    Early-level fusion and late-level fusion are the two main feature fusion strategies. Late-level fusion is expensive in terms of learning effort, as it requires a learning algorithm at each step, and it also has the issue of potential loss of correlation in the fused space [48]. Combining two feature vectors for prediction models is a challenging task. Early-level fusion is a feature-level fusion strategy that concatenates two feature sets into a common feature vector [49]. In sum, the features obtained through late-level feature fusion are only those highly associated with the target class, whereas the focus of the proposed approach includes not only features highly associated with the target class but also feature dependencies and feature significance. To make proactive decisions for the improvement of students' performance and to build different academic strategic plans, student data must be analyzed properly. Therefore, the early-level fusion strategy is adopted to fuse the significant feature vector Edf and the associated feature vector Eaf; this may lead towards the optimal selection of features for predicting the academic performance of students. The cosine similarity measure is used to calculate the similarity between two vectors [50]: the similarity is computed as the cosine of the angle between them. There are different approaches for measuring the similarity between two vectors, including cosine similarity, the Jaccard coefficient, and Spearman distance. Among these, cosine similarity has been shown to work best [51-53] and to have better retrieval effectiveness than other similarity measures, whereas other similarity measures have the drawbacks of giving dominance to the largest-scale feature and of being sensitive to outliers [54]. Furthermore, other similarity measures are not the best choice when the similarity relations are complex [52]. Cosine similarity is used to measure similarity between two vectors in different domains such as pattern recognition and face recognition [55], text classification [56], and search engines [57]. In the proposed hybrid feature selection framework, cosine similarity weights are computed to identify the optimal features and to fuse the feature vectors by tuning the parameters of the cosine similarity measure; weights are given to different features based on the similarities between them. The fusion of the feature vectors Edf and Eaf is computed in two levels. Fig. 2 reflects the whole fusion step in the proposed hybrid feature selection framework.

    Figure 2:Flow of fusion steps in hybrid feature selection framework
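    For reference, the cosine similarity that underlies the fusion steps can be computed as below; this is the standard cosine measure over plain NumPy vectors, shown only for illustration.

        import numpy as np

        def cosine_similarity(u, v):
            # Cosine of the angle between two feature vectors u and v.
            norm = np.linalg.norm(u) * np.linalg.norm(v)
            return float(np.dot(u, v) / norm) if norm else 0.0

        # Two feature-indicator vectors that agree on two of three positions:
        print(cosine_similarity(np.array([1, 0, 1]), np.array([1, 1, 1])))   # ~0.816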

    3.3.1 Level 1: Identification of Projected Features Using Cosine Weighting

    In this section, level 1 of the feature fusion technique for the hybrid feature selection framework is explained in detail. The cosine similarity measure is chosen over other similarity measures because of its effectiveness and its ability to deal with complex similarities. Fig. 3 shows a block diagram of the process of identifying hybrid features by fusing the significant feature vector Edf and the associated feature vector Eaf using cosine weights cpfw.

    In the first step, the projected features pf are identified from the feature vectors, where the projected features are defined as the features having a projection in both Edf and Eaf. Such hybrid features are highly important, as they may be highly associated with the target class as well as significant. A feature vector daf, denoting the feature vector containing the projected features, is initialized as an empty vector: let daf = Φ.

    Referring to Section 3.1, Eq. (4), Edf presents a feature vector containing significant features:

    Edf = {fd1, fd2, fd3, ..., fdn}

    Figure 3:Block diagram of level 1 of projected feature selection framework using fusion

    Referring to Section 3.2, Eq. (5), Eaf presents the feature vector containing associated features:

    Eaf = {fa1, fa2, fa3, ..., fan}

    Cosine similarity weights are introduced to identify the projected features pf from the fusion of Edf and Eaf. The similarity between the two feature vectors can be measured using the cosine similarity technique [58]. Eq. (6) presents the cosine similarity equation for the identification of projected features,

    where Edf · Eaf = {fd1·fa1, fd2·fa2, ..., fdn·fan} and the values of sim(fdi, fai) will either be 0 or 1.

    a) If sim(fdi, fai) == 0 then cpfw = 0 and fdi = cpfw and fai = cpfw; furthermore, fdi and fai are ignored in this step.

    b) If sim(fdi, fai) == 1 then cpfw = 1 and fdi = cpfw and fai = cpfw.

    In this case the feature is added as a projected feature to the projected feature vector daf.

    Point (a) above states that if the similarity between two features fdi and fai from the feature vectors Edf and Eaf is 0, the features are not similar and are therefore ignored. Point (b) states that if the similarity between the two features is 1, the features are similar and are therefore added to the projected feature vector daf. Hence the projected feature vector daf contains all the projected features with cpfw = 1.
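    To make the level-1 step concrete, the sketch below treats Edf and Eaf as sets of selected feature indices and marks a feature as projected when it appears in both, which corresponds to a cosine weight cpfw of 1 for matching entries and 0 otherwise; this set-based representation is an assumption made for illustration.

        def level1_projected_features(Edf, Eaf):
            # Level 1: features selected by both the filter (Edf) and the wrapper (Eaf).
            # Every member of the returned vector daf has cpfw == 1.
            return sorted(set(Edf) & set(Eaf))

        # Hypothetical feature indices:
        Edf = [0, 2, 3, 5, 7]   # significant features (chi-square > 0)
        Eaf = [1, 2, 5, 7, 8]   # associated features (selected by SFS + SVM)
        print(level1_projected_features(Edf, Eaf))   # -> [2, 5, 7]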

    3.3.2 Level 2: Identification of Highly Significant Features (Edf+) and Highly Associated Features (Eaf+)

    Level 2 of the feature fusion step in the proposed hybrid feature selection framework identifies the highly significant and highly associated features by fusing the significant feature vector Edf, the associated feature vector Eaf, and the projected feature vector daf using cosine feature weights. Fig. 4 explains the identification of Edf+ and Eaf+ using cosine feature weights. Level 2 of the feature fusion consists of two further steps: in the first step uniquely significant features are identified, and in the second step uniquely associated features are identified.

    Figure 4:Block diagram of level 2 of hybrid feature selection framework using fusion

    Uniquely significant feature identification: Edf+ is initialized as an empty set. Edf+ is the feature vector containing all uniquely significant features, i.e., the significant features that are not projected over the associated features. Referring to Eq. (4), Edf presents a feature vector containing significant features:

    Edf = {fd1, fd2, fd3, ..., fdn}

    daf presents the projected feature vector, and cpfw is the weight of the projected features in daf. The fusion of Edf and daf is computed using csfw, where csfw is the cosine weight for uniquely significant features. Eq. (7) presents the cosine weight for uniquely significant features,

    where Eq. (8) presents the cosine weight for a uniquely significant feature identified from Edf and daf.

    The values of csfw will either be 0 or 1.

    a) If csfw == 1 then fdi = csfw, so the feature fdi is ignored and not added to Edf+.

    b) If csfw == 0 then fdi = csfw, so fdi is added to the Edf+ feature vector.

    Case (a) shows that if the value of csfw becomes 1, the feature in Edf is similar to a feature in daf, so it is not considered a uniquely significant feature. Case (b) shows that if the value of csfw is 0, the feature has no projection and is therefore considered uniquely significant, and hence added to Edf+. In sum, the projections of the daf feature vector are compared with Edf in this step: the weights of Edf and daf are compared in such a way that features having no projection onto the associated feature vector are added to the Edf+ feature vector.

    Edf+ = feature vector containing the uniquely significant features.

    Uniquely associated feature identification: Eaf+ is initialized as an empty set. Eaf+ is the feature vector containing all uniquely associated features. daf is compared with Eaf: the similarity between the features of Eaf and daf is checked in such a way that the features having no projection onto daf are kept for the Eaf+ feature vector. Referring to Section 3.2, Eq. (5), Eaf presents the feature vector containing associated features:

    Eaf = {fa1, fa2, fa3, ..., fan}

    The fusion of Eaf and daf is computed using cafw, where cafw is the cosine weight for the uniquely associated features. Eq. (9) presents the equation for calculating the uniquely associated features,

    where Eq. (10) presents the cosine weight for identifying the i-th feature.

    The values of cafw are either 0 or 1.

    i. If cafw == 1 then fai = cafw; furthermore, fai and dafi are ignored.

    ii. If cafw == 0 then fai = cafw, and fai is added to the Eaf+ feature vector.

    Eaf+ is the feature vector containing the features uniquely associated with the target class. In sum, case (i) shows that if a feature is projected onto the significant features, it is neglected and not added to Eaf+, whereas case (ii) shows that a feature having no projection onto the significant features is considered uniquely associated and is hence added to Eaf+. As a result of level 1 and level 2 of the feature fusion, three types of feature vectors are identified: Edf+, Eaf+, and daf.
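    Continuing the same set-based illustration, level 2 can be sketched as removing the projected features from each vector, leaving the uniquely significant set Edf+ and the uniquely associated set Eaf+; again this is an assumed representation, not the authors' code.

        def level2_unique_features(Edf, Eaf, daf):
            # Level 2: keep only the features with no projection onto daf.
            daf = set(daf)
            Edf_plus = sorted(set(Edf) - daf)   # csfw == 0 -> uniquely significant
            Eaf_plus = sorted(set(Eaf) - daf)   # cafw == 0 -> uniquely associated
            return Edf_plus, Eaf_plus

        # Continuing the level-1 example:
        Edf, Eaf, daf = [0, 2, 3, 5, 7], [1, 2, 5, 7, 8], [2, 5, 7]
        print(level2_unique_features(Edf, Eaf, daf))   # -> ([0, 3], [1, 8])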

    3.4 Model Training

    The selected features are further used to train the SVM classification algorithm. SVM is selected in the proposed approach due to its high generalization ability and its history of achieving high accuracy in data mining [59]. 10-fold cross-validation is performed to evaluate the robustness of the proposed hybrid feature selection framework, and a high-frequency feature matrix is obtained by applying the frequency criterion. The hybrid features are divided into ten folds using 10-fold cross-validation, and the model is trained using the SVM classification algorithm, which is used due to its flexibility in dealing with educational parameters in prediction models [60,61]. The SVM linear kernel is adjusted with the help of the optimization function of the linear kernel. Eq. (11) presents the SVM linear kernel function [62], which in its standard form is K(xi, xj) = xi · xj.

    SVM kernel functions have the ability to transform the dataset space into a higher dimension, and each kernel has an optimization function used to obtain high performance [63]. For the SVM linear kernel, the penalty value C is the optimized parameter; the value of C is tuned to obtain better classification predictions for the proposed approach. Furthermore, the selected features are trained on the SVM linear kernel and then tested and evaluated through different evaluation measures. The details of each evaluation measure are explained in the next section.
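    A compact sketch of this training step, assuming scikit-learn and a plausible grid of C values (the grid itself is not specified in the paper), is:

        from sklearn.svm import SVC
        from sklearn.model_selection import GridSearchCV

        def train_svm_linear(X_selected, y, c_grid=(0.01, 0.1, 1, 10, 100)):
            # Linear-kernel SVM on the fused features; the penalty C is tuned by 10-fold CV.
            search = GridSearchCV(SVC(kernel="linear"),
                                  param_grid={"C": list(c_grid)},
                                  cv=10, scoring="accuracy")
            search.fit(X_selected, y)
            return search.best_estimator_, search.best_params_["C"], search.best_score_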

    3.5 Evaluation Measures

    The performance of the proposed approach is measured through prediction accuracy, precision, recall, and f-measure. These evaluation metrics are widely used in different domains such as information retrieval, machine learning, sentiment analysis, and EDM [34,61]. Let D be a student dataset containing n features for m students, and let SDm be the student data feature matrix: the size of the feature vector for each example within the data matrix SDm is n, and m is the number of examples. Each entry of a feature vector contains data related to the student's information relevant to his/her educational activity.

    Size of the SDm feature vector = size of the feature vector (n).

    Dimension of the feature matrix = number of examples in dataset D (m).

    The hybrid feature selection framework using fusion is evaluated using the prediction accuracy, precision, recall, and f-measure evaluation measures, which are explained below. Prediction Accuracy: Accuracy is the ratio of correct predictions to all predictions, i.e., Accuracy = (TP + TN) / (TP + TN + FP + FN); Eq. (13) shows the accuracy formula used to evaluate the hybrid feature selection framework with fusion. It measures the overall effectiveness of the prediction model. However, accuracy cannot show how well the minority classes are classified, and accurately predicting the positive outcome alone is not adequate.

    Recall and Precision: A good prediction model must make successful positive as well as successful negative predictions; hence the precision and recall evaluation measures are also used to evaluate the proposed hybrid feature selection framework. Eqs. (14) and (15) present the recall and precision calculations, which in their standard forms are Recall = TP / (TP + FN) and Precision = TP / (TP + FP).

    F-measure: The f-measure considers both precision and recall; it is computed to assess the classification of instances with respect to the target class, and Eq. (16) presents its calculation, F-measure = 2 · Precision · Recall / (Precision + Recall). In sum, these evaluation measures give a deeper insight into the performance of the proposed hybrid feature selection framework, so the proposed hybrid feature selection is validated not only in terms of accuracy but also in terms of precision, recall, and f-measure. This section presented the proposed hybrid feature selection framework using fusion, discussing each level of the framework in detail: the identification of significant features, the identification of associated features, the identification of features projected onto the significant and associated features, and the cosine-based feature fusion through cosine weighting. It also discussed the model training and the evaluation measures used to evaluate the proposed approach. The next section presents the simulation results of the proposed approach on benchmark students' datasets.
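    These measures can be computed directly from the prediction results; the sketch below uses scikit-learn's standard metric implementations and assumes that "pass" is treated as the positive class.

        from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

        def evaluate(y_true, y_pred, positive_label="pass"):
            # Accuracy, precision, recall and f-measure for the pass/fail prediction task.
            return {
                "accuracy":  accuracy_score(y_true, y_pred),
                "precision": precision_score(y_true, y_pred, pos_label=positive_label),
                "recall":    recall_score(y_true, y_pred, pos_label=positive_label),
                "f_measure": f1_score(y_true, y_pred, pos_label=positive_label),
            }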

    4 Results and Discussion

    To check the robustness of a hybrid feature selection method, datasets with varying numbers of features and instances are required. Hence, to empirically evaluate the proposed hybrid feature selection framework using fusion, four benchmark datasets of students' academic records from different educational domains are selected, since robustness in feature selection can be evaluated through variations in the number of instances or the number of features [64]. These four datasets are benchmark datasets and are publicly available. The datasets, acquired from different databases, have different attributes from each other and hence present a different set of challenges which have not been studied altogether previously. Four different students' benchmark datasets have been used in the proposed research, due to their diversity in terms of the number of features, the number of instances, and the areas of education they belong to, in order to show the robustness of the proposed hybrid feature selection framework for student performance prediction.

    Tab. 1 presents a brief description of the four benchmark datasets in terms of the number of instances and the number of features. The number of instances in a student dataset is the number of records in that dataset. Tab. 1 shows the number of instances in the Math, LMS, CS, and PLang datasets: the PLang dataset contains 649 instances (student records), more than the other three datasets, whereas the CS dataset contains the smallest number of instances. The reason for selecting datasets with different instances and different attributes is to evaluate whether the proposed framework is robust. Tab. 1 also shows the number of features in the Math, LMS, CS, and PLang datasets: the Math and PLang datasets each contain 32 features (although with different numbers of instances), the LMS dataset contains 16 features, and the CS dataset contains the smallest number of features of the four. Hence, the Math, LMS, CS, and PLang datasets differ from each other in terms of both the number of student records and the number of features/attributes.

    Table 1:Dataset’s description in brief

    Simulation Environment: The simulations to implement the proposed hybrid feature selection framework for EDM were conducted on a Core i5 machine. Python 2.7 was used as the language, and the PyCharm Edu IDE was set up as the development environment.

    4.1 Prediction Accuracy of Proposed Hybrid Feature Selection Framework

    To validate the performance of the proposed hybrid feature selection framework, its accuracy is evaluated. Accuracy is defined as the fraction of correctly predicted observations over the total observations [65-67], and the model with better accuracy is considered the better prediction model [68]. The accuracy of the proposed approach therefore gives the ratio of students correctly classified as pass or fail over the total number of students, i.e., the overall effectiveness of the proposed hybrid feature selection framework. Furthermore, accuracy shows the effectiveness over the existing feature selection framework and feature selection algorithms, by comparing the accuracy of the proposed framework on the benchmark students' datasets with that of the existing framework and algorithms. In this section, the accuracy of the proposed hybrid feature selection framework is compared with the existing feature selection framework and with other feature selection algorithms such as FCBF, Information Gain, and CFS.

    Referring to Tab. 2, the results in Fig. 5 present the comparison of the prediction accuracy of the existing feature selection framework [39] with the proposed hybrid feature selection framework. In Fig. 5, the red bar shows the proposed framework and the black bar shows the existing hybrid feature selection framework [39]; the x-axis shows the four benchmark datasets and the y-axis shows the prediction accuracy (percentage) on these datasets. It is clearly observed that the prediction accuracy of the proposed feature selection on all datasets is better than that of the existing hybrid feature selection framework, so the proposed feature selection performs better in terms of prediction accuracy than the existing feature selection framework. As the existing framework overlooks the prediction model [69] and neglects optimal features, the proportion of correctly classified instances with the proposed hybrid feature selection is greater than with the existing hybrid feature selection framework [39]. Fig. 6 shows the comparison of the prediction accuracy of the existing FCBF filter feature selection algorithm [70] with the proposed hybrid feature selection framework; the x-axis shows the four benchmark datasets and the y-axis shows the accuracy (percentage) of FCBF and of the proposed feature selection framework. Fig. 6 shows that the proposed feature selection framework outperforms FCBF on all selected benchmark datasets, meaning that the number of students correctly classified by FCBF on each of the four datasets is much less than with the proposed feature selection framework; hence the proposed feature selection performs better in terms of prediction accuracy than the FCBF feature selection algorithm. The results reported in the existing literature [39] also show that the prediction accuracy obtained using FCBF feature selection is less than that of the proposed hybrid feature selection framework for predicting the performance of students.

    Table 2:Validation of proposed hybrid feature selection framework in terms of prediction accuracy

    4.2 Precision of Proposed Hybrid Feature Selection Framework Using Fusion

    The performance of the hybrid feature selection is also validated through precision and recall. To show the classification of minority classes in a prediction model, the precision of the proposed hybrid feature selection framework is evaluated. Precision is the fraction of the retrieved instances that belong to the target class: for the proposed feature selection framework, precision gives the ratio of the number of pass students classified correctly over the number of students classified as pass. In sum, it shows how accurately the pass students are identified. The larger the number of pass students correctly classified, the better educational stakeholders can build productive academic plans for improving students' performance. In this section, the precision results of the proposed hybrid feature selection framework on the four benchmark datasets are compared with the existing feature selection framework and with feature selection algorithms for predicting the performance of students. Tab. 3 shows the comparison of the precision of the existing hybrid feature selection framework [39] with the proposed feature selection framework. In Fig. 7, the red bar shows the proposed framework and the black bar shows the existing hybrid feature selection framework; the x-axis shows the four datasets and the y-axis shows the precision (percentage) on the four datasets. It is clearly observed that the precision of the proposed feature selection on all datasets is better than that of the existing hybrid feature selection framework; moreover, the number of correctly classified students with the proposed feature selection framework is greater than with the existing hybrid feature selection framework. Hence the proposed hybrid feature selection framework performs better in terms of precision than the existing feature selection framework.

    Figure 5:Comparing the prediction accuracy of proposed feature selection with FCBF

    Figure 6:Comparing the prediction accuracy of proposed hybrid feature selection framework with existing work

    Table 3:Validation of proposed hybrid feature selection framework in terms of precision

    Figure 7:Comparing the precision of proposed framework with existing framework

    Fig. 8 shows the comparison of the precision of the existing FCBF filter feature selection algorithm with the proposed hybrid feature selection framework. In Fig. 8, the red bar shows the proposed framework and the purple bar shows the existing FCBF filter feature selection algorithm; the x-axis shows the four datasets and the y-axis shows the precision (percentage) obtained by applying FCBF and the proposed feature selection framework. The results in Fig. 8 depict that the number of students correctly classified by the FCBF algorithm on all selected datasets is much less than the number correctly classified by the proposed feature selection framework; hence the proposed feature selection performs better in terms of precision than the existing FCBF feature selection algorithm. Fig. 9 shows the comparison of the precision of the existing IG (Information Gain) filter feature selection algorithm with the proposed feature selection framework. In Fig. 9, the red bar shows the proposed framework and the light blue bar shows the existing IG filter feature selection algorithm; the x-axis shows the four benchmark datasets and the y-axis shows the precision (percentage) on the four datasets. It is clearly observed that the precision of the proposed feature selection on all selected datasets is better than that of the existing IG feature selection algorithm, hence the proposed feature selection performs better in terms of precision than the existing IG filter feature selection algorithm.

    Figure 8:Comparing precision of proposed hybrid feature selection framework with FCBF

    Figure 9:Comparing precision of proposed hybrid feature selection framework with information gain

    Fig. 10 shows the comparison of the precision of the existing CFS (Correlation-based Feature Selection) filter feature selection algorithm with the proposed hybrid feature selection framework; the x-axis shows the four datasets and the y-axis shows the precision (percentage) on the four datasets. It is clearly observed that the precision of the proposed feature selection on the Math, LMS, CS, and PLang datasets is better than that of the existing CFS feature selection algorithm, hence the proposed feature selection performs better in terms of precision than the existing CFS filter feature selection algorithm.

    4.3 Recall of Proposed Hybrid Feature Selection Framework Using Fusion

    Recall is another important measure to evaluate the efficiency of the selected features [7]. Recall is the fraction of the target class recognized as the actual class [71,72]; it gives the ratio of correctly classified students belonging to a particular class over the total number of students in that class. The recall results of the proposed hybrid feature selection framework therefore present the ratio of correctly classified pass students over the total number of pass students, and depict the worth of the features selected by the proposed framework on the pass class. In sum, recall indicates the extent to which the features selected by the proposed framework can reflect the performance of students. Tab. 4 presents the recall results of the proposed hybrid feature selection framework, the existing feature selection framework, and the FCBF, Information Gain, and CFS feature selection algorithms on the four benchmark students' datasets (having diversity in the number of features, the number of instances, and the educational domains). Referring to Tab. 4, the results in Fig. 11 present the comparison of the recall of the existing hybrid feature selection framework [39] with the proposed feature selection framework; the x-axis shows the four datasets and the y-axis shows the recall (percentage) on the four datasets. It is clearly observed that the recall of the proposed feature selection on all selected datasets is better than that of the existing hybrid feature selection framework.

    Figure 10:Comparing precision of proposed hybrid feature selection framework with CFS

    Table 4:Validation of proposed hybrid feature selection framework in terms of recall

    Figure 11:Comparing recall of proposed hybrid feature selection framework with existing hybrid feature selection framework

    Hence the proposed feature selection performs better in terms of recall than the existing feature selection framework. Fig. 12 shows the comparison of the recall of the existing FCBF filter feature selection algorithm with the proposed hybrid feature selection framework; the x-axis shows the four datasets and the y-axis shows the recall (percentage) obtained by applying FCBF and the proposed feature selection framework. The results shown in Fig. 12 depict that there is a greater number of incorrectly classified than correctly classified students for each class (pass, fail) when applying the FCBF algorithm on the Math, LMS, and PLang datasets, whereas the proposed feature selection framework yields a much smaller number of incorrectly classified students for each class on these datasets. The results also depict that FCBF and the proposed feature selection framework show similar results on the CS dataset, meaning the ratio of correctly classified students per class over the total number of students in that class is the same for FCBF and for the proposed feature selection; it is also noticed that the CS dataset contains a smaller number of features than the other three datasets. Overall, the proposed feature selection performs better in terms of recall than the existing FCBF feature selection algorithm.

    Fig. 13 shows the comparison of the recall of the existing CFS filter feature selection algorithm with the proposed feature selection framework on all datasets (referring to the results in Tab. 4). In Fig. 13, the red bar shows the proposed framework and the blue bar shows the existing CFS filter feature selection algorithm; the x-axis shows the four datasets and the y-axis shows the recall (percentage) on the four datasets. It is clearly observed that the proposed feature selection on all datasets shows a better result than the existing CFS feature selection algorithm, hence the proposed feature selection also performs better in terms of recall than the existing CFS filter feature selection algorithm.

    Figure 12:Comparing recall of proposed hybrid feature selection framework with FCBF feature selection

    4.4 F-Measure of Proposed Hybrid Feature Selection Framework Using Fusion

    To evaluate the performance of the proposed hybrid feature selection framework, the f-measure results of the framework on the four benchmark datasets are evaluated. F-measure is commonly used in EDM; it reaches its maximum value when there is a balance between the precision and recall evaluation measures [71]. F-measure is the harmonic mean of precision and recall, F-measure = 2 · Precision · Recall / (Precision + Recall), and it conveys the balance between precision and recall. In this section, the f-measure results of the proposed hybrid feature selection framework are evaluated on the four benchmark students' datasets and compared with the f-measure results of the existing feature selection framework and feature selection algorithms (FCBF, Information Gain, CFS) on the same datasets, in order to validate the proposed hybrid feature selection framework.

    Tab. 5 presents the f-measure results of the proposed hybrid feature selection framework, the existing feature selection framework, and the FCBF, Information Gain, and CFS feature selection algorithms on the four benchmark students' datasets (having diversity in the number of features, the number of instances, and the educational domains). Fig. 14 shows the comparison of the f-measure of the existing hybrid feature selection framework with the proposed feature selection framework.

    Table 5:Validation of proposed hybrid feature selection framework in terms of F-Measure

    Figure 14:Comparing F-Measure of proposed hybrid feature selection framework with existing hybrid feature selection framework

    The x-axis shows the four datasets, and the y-axis shows the f-measure (percentage) on the four datasets. It is clearly observed that the f-measure of the proposed feature selection on the Math, LMS, CS, and PLang datasets is better than that of the existing hybrid feature selection framework, hence the proposed feature selection performs better in terms of f-measure than the existing feature selection framework. Fig. 15 shows the comparison of the f-measure of the existing FCBF filter feature selection algorithm with the proposed feature selection framework; the x-axis shows the four datasets, and the y-axis shows the f-measure values of FCBF and of the proposed feature selection framework. Fig. 15 depicts that the f-measure results of FCBF on the selected datasets are lower than those of the proposed feature selection framework, hence the proposed feature selection performs better in terms of f-measure than the FCBF feature selection algorithm. Fig. 16 shows the comparison of the f-measure of the existing IG filter feature selection algorithm with the proposed feature selection framework; the x-axis shows the four datasets, and the y-axis shows the f-measure values of IG and of the proposed feature selection framework.

    Figure 15:Comparing F-Measure of proposed hybrid feature selection framework with FCBF feature selection

    Fig. 16 depicts that the f-measure results of IG on all datasets are lower than those of the proposed feature selection framework, hence the proposed feature selection performs better in terms of f-measure than the IG feature selection algorithm. Fig. 17 shows the comparison of the f-measure of the existing CFS filter feature selection algorithm with the proposed feature selection framework. In Fig. 17, the red bar shows the proposed framework and the green bar shows the existing CFS filter feature selection algorithm; the x-axis shows the four datasets, and the y-axis shows the f-measure (percentage) on the four datasets. It is clearly observed that the f-measure of the proposed feature selection on the four benchmark datasets is better than that of the existing CFS feature selection algorithm, hence the proposed feature selection performs better in terms of f-measure than the existing CFS filter feature selection algorithm. The above results show that the proposed hybrid feature selection framework performs better on four benchmark datasets with varying numbers of features and instances than the other feature selection algorithms as well as the existing hybrid feature selection in EDM. In sum, the results conclude that the proposed hybrid feature selection outperforms the existing hybrid feature selection and the existing feature selection algorithms; hence the proposed hybrid feature selection framework is validated.

    Figure 16:Comparing F-Measure of proposed hybrid feature selection framework with IG feature selection

    Figure 17:Comparing F-Measure of proposed hybrid feature selection framework with CFS feature selection

    5 Conclusion

    This research identifies suitable feature selection algorithms for identifying optimal features for predicting the performance of students. The proposed hybrid feature selection framework overcomes the issues identified in the existing hybrid feature selection framework [39] as well as in other hybrid feature selection algorithms [31,33,73]. The proposed framework contributes to the body of knowledge of EDM in that it identifies optimal features that are significant as well as associated with the target class, and its two-level feature fusion adds a novel contribution to the state of the art of students' performance prediction for obtaining an optimal selection of features. The proposed framework not only identifies the optimal features but also performs better in terms of accuracy, precision, recall, and f-measure than the existing hybrid feature selection framework [39] for predicting the performance of students. Furthermore, the proposed hybrid feature selection framework has the ability to perform well with different numbers of features and instances, as it is validated on benchmark datasets with different numbers of features and instances to show its robustness. Future Directions: In the future, hybridization of different filter and wrapper feature selection methods will be considered for further accuracy improvement of students' performance prediction models, and other educational stakeholders, such as teachers, will also be considered in the prediction model.

    Funding Statement: The authors received no specific funding for this study.

    Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
