
    A New Fuzzy Adaptive Algorithm to Classify Imbalanced Data

    Computers, Materials & Continua, 2022, Issue 1 (published 2022-11-09)

    Harshita Patel, Dharmendra Singh Rajput*, Ovidiu Petru Stan and Liviu Cristian Miclea

    1 School of Information Technology & Engineering, Vellore Institute of Technology, Vellore, 632014, India

    2 Technical University of Cluj Napoca, Faculty of Automation and Computer Science, Cluj Napoca, 400114, Romania

    Abstract: Classification of imbalanced data, where one class's representation is overwhelmed by the others, is a well-explored issue in the data mining and machine learning community. Imbalanced distributions occur naturally in real-world datasets, so they need to be handled carefully to extract important insights. When datasets are imbalanced, traditional classifiers sacrifice performance and produce misclassifications. This paper suggests a weighted nearest neighbor approach in a fuzzy manner to deal with this issue. We adapt the 'existing algorithm modification' solution to learn from imbalanced datasets, classifying data without manipulating its natural distribution, unlike the other popular data balancing methods. The K nearest neighbor is a non-parametric classification method widely used in machine learning problems. Fuzzy classification with the nearest neighbor clarifies the degree to which an instance belongs to each class, and optimal weights combined with an improved nearest neighbor concept help to correctly classify imbalanced data. The proposed hybrid approach takes care of the imbalanced nature of data and reduces the inaccuracies that appear when original, traditional classifiers are applied. Results show that it performs well over the existing fuzzy nearest neighbor and weighted neighbor strategies for imbalanced learning.

    Keywords: Machine learning; fuzzy classification; nearest neighbor; adaptive approach; optimal weights

    1 Introduction

    The last few decades have borne witness to various developments in science and technology. These developments have enabled the generation of enormous amounts of data, along with opportunities for mining useful information from this data and other activities of data science, as can already be seen in various data mining applications [1,2]. In such applications many challenges occur at different levels. Classification of imbalanced data is one of the important and frequently occurring challenges of data mining. In general, imbalance refers to the unequal distribution of data into classes, where a large number of data instances belong to one class while a small number of examples represent the others, known respectively as the majority and minority classes. As a result, the classifier's accuracy is biased towards the majority class and minority class instances are misclassified. This happens because traditional classifiers generally assume a balanced distribution of data. Various types of imbalance are 'between-class imbalance,' 'intrinsic/extrinsic,' 'relative,' 'absolute rarity' and 'within-class imbalance,' etc. [3-5]. Classification of imbalanced data is considered one of the top ten challenging issues of data mining [6], and researchers include it among the new and frequently explored trends of data mining [7,8]. Imbalance can become very hazardous because it appears in many real-world applications such as medical diagnosis [9,10], oil-spill detection [11], credit card fraud detection [12], culture modeling [13], network intrusion, text categorization, helicopter gearbox fault monitoring, remote sensing classification for land mine detection [14], etc. These examples show that datasets with imbalance require special treatment.

    Four known ways to deal with imbalance are (i) balancing datasets by resampling techniques, (ii) modification of traditional classification algorithms, (iii) cost-sensitive techniques and (iv) ensemble approaches. In this paper we focus on the second approach, modifying the traditional classifier. We propose an improved nearest neighbor approach to learn from imbalanced data with fuzzy logic.

    The nearest neighbor classifier is a significant instance-based learning approach used when prior information about the data is not available and the sample size is insufficient to represent the underlying distribution. No classifier is prepared in advance; the class label is assigned on the basis of which class contributes the most nearest neighbors of the test instance. It is one of the best known and most important algorithms of data mining [15]. K-nearest neighbor is characterized by its simplicity, programmability, comprehensibility, and robustness. Its error rate is bounded above by twice the Bayes error rate [16,17]. Improved versions and weighted variants have been proposed to solve different issues, such as, but not limited to, imbalanced data. Combining the nearest neighbor with fuzzy logic further improved its performance, and adding weights addresses still more issues. Fuzzy logic, unlike the crisp concept in classification, looks for memberships of data instances in classes instead of their complete belonging. Though fuzzy logic provides complementary solutions rather than competing with crisp ones, it helps obtain a better classification. Fuzzy K nearest neighbor and weighted nearest neighbors can deal with imbalance when they are specifically designed for such cases. In the past, many fuzzy rule-based algorithms and other combinations with nearest neighbor algorithms have been proposed to tackle the imbalance issue. The optimally weighted fuzzy nearest neighbor is the most trusted due to its low bias; its weighting is kriging-based, which is also known as the best linear estimator [18]. The main contributions of this paper are:

    · This paper adopts an algorithm modification strategy for learning.

    · Optimal weights and the adaptive approach [19] are merged with the fuzzy nearest neighbor [20], resulting in better classification performance for imbalanced data.

    · This research work is an advanced version of [21] with detailed experimental studies and assessment of significance.

    The paper is organized as follows: Section 2 contains a literature review related to the proposed algorithm. Section 3 offers brief details of the basic techniques underlying our work. Section 4 describes the steps of the proposed improved fuzzy weighted nearest neighbor methodology for imbalanced data. Section 5 discusses the way in which our algorithm works. Experiments and results are discussed in Section 6, followed by significance testing in Section 7. Section 8 concludes the work and marks future possibilities.

    2 Related Works

    This section discusses relevant modified nearest neighbor approaches for dealing with imbalanced data. To evaluate the performance of classifiers under different degrees of class imbalance, Prati et al. [22] designed an experimental setup. A confidence interval-based procedure was also proposed to examine the performance statistics of classifiers in this setup. It was discovered that misclassification is proportional to the degree of imbalance, i.e., higher imbalance results in higher loss and vice versa, and that existing solutions only partially deal with the issue. López et al. [23] performed a two-fold study on imbalanced learning. In the first fold they examined pre-processing with data balancing techniques, cost-sensitive techniques and ensemble techniques in an experimental setting. In the second fold the authors discussed the significance of inherent data characteristics, such as the size or density of the sample, the possibility of classes overlapping, the presence of noise, etc.

    A good number of crisp and fuzzy nearest neighbor approaches have been proposed to improve the classification of imbalanced data. Kriminger et al. [24] proposed a single-class algorithm entitled Class Conditional Nearest Neighbor Distribution (CCNND) to minimize the consequences of imbalance by applying the local geometric structure in data. Tomašev et al. [25] consider that the high misclassification rate is due to minority class examples; the situation is different in low- and medium-dimensional datasets, where majority class examples are responsible for misclassification. An Instance Hybrid Selection using Nearest Neighbor (HISNN) approach was proposed by Ryu et al. [26] for Cross-Project Defect Prediction (CPDP), where class imbalance is present in the distributions of source and target projects. In this approach, the K-nearest neighbor algorithm is used to learn local information, while global information is learned by naive Bayes. This hybrid approach yields high performance in software defect prediction.

    Some notable contributions with weighting strategies have been made by the community. Dubey et al. proposed a modified class-based weighted nearest neighbor algorithm for imbalanced data in which weights are calculated from the distribution of the nearest neighbors of test instances for the traditional k-nearest neighbor approach [27]. A hybrid neighbor weighted approach was proposed by Patel et al. [28] to improve imbalanced learning using the nearest neighbor policy; large and small weights for small and large classes are improved with different values of K for different classes, according to their sizes. Ando [29] proposed another class-wise weighted nearest neighbor classification model in which a convex optimization technique was used to learn weights with a powerful mathematical model that maximizes a nonlinear performance measure on training data. An improved weighted nearest neighbor approach with class confidence weights was proposed by Liu et al. [30]. This approach uses attribute probabilities to weight prototypes and to obtain posterior probabilities. Class confidence weights were calculated using mixture modelling and Bayesian networks.

    Not a lot of work has been done on fuzzy K-nearest neighbor approaches for imbalanced data. A fuzzy-rough ordered weighted average nearest neighbor approach was proposed by Ramentol et al. [31] for binary class imbalance using six weight vectors. They also proposed indiscernibility relations in combination with these weight vectors. Fernández et al. [32] analyzed fuzzy rule based classification systems for imbalanced data sets, applying adaptive parametric conjunction operators for better classification results under varying imbalance ratios. Han et al. [33] proposed a nearest neighbors approach based on fuzzy and rough properties that minimizes the bias generated by the majority class. They also defined a membership function that gives advantages to minority class examples. A coupled fuzzy K-nearest neighbor approach for categorical data was proposed by Liu et al. [34], where data instances are unequally distributed and retain bonds among attributes, classes and other instances. Assignment of sized membership, similarity calculation and integration are the key functions of this approach. Patel et al. [35] proposed an adaptive fuzzy nearest neighbor method to deal with the class imbalance issue, with K values varying in proportion to the sizes of the classes; this fuzzy-adaptive K concept deals well with the bias of the traditional nearest neighbor classifier.

    3 Preliminaries

    This section provides the fundamentals of the K-nearest neighbor algorithm, the fuzzy K-nearest neighbor algorithm, the adaptive approach and the optimally weighted fuzzy KNN. These details make our proposed approach easier to follow. We use the default Euclidean distance as the distance measure to find the nearest neighbors of data instances. The following subsections explain all of these approaches with their mathematical formulation.

    3.1 K-Nearest Neighbor Algorithm

    We know that for the K nearest neighbor algorithm the training set is kept until the classification process is completed and no classifier is prepared in advance. Consider any query instance q; to assign q a class label, the algorithm finds the K nearest neighbors of q in the training set, where K is any integer value. The concept of KNN says that the class label assigned to the query instance is that of the class from which it has the most nearest neighbors. The mathematical formulation of KNN can be understood from the following equation:

    C(q) = argmax_{Ci, i=1,...,m} Σ_{xj ∈ Q(q,K)} S(xj, Ci)

    Here C(q) = class label of q, to be predicted,

    m = number of classes,

    Q(q,K) = set of K nearest neighbors of q and

    S(xj, C) = 1 if xj belongs to class C, and 0 otherwise.
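As an illustration, the majority-vote rule above can be sketched in a few lines; the Python/NumPy form and the toy data are ours, not from the paper:

```python
import numpy as np

def knn_classify(X_train, y_train, q, K):
    """Majority-vote KNN: C(q) = argmax_C of the sum of S(x_j, C) over Q(q, K)."""
    d = np.linalg.norm(X_train - q, axis=1)      # Euclidean distances to q
    nn = np.argsort(d)[:K]                       # indices of Q(q, K)
    labels, votes = np.unique(y_train[nn], return_counts=True)
    return labels[np.argmax(votes)]              # class with the most neighbors wins

# toy data: two well-separated classes
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0], [5.1, 5.0]])
y = np.array([0, 0, 0, 1, 1])
print(knn_classify(X, y, np.array([0.05, 0.05]), K=3))  # -> 0
```

Ties in the vote are resolved arbitrarily here; the fuzzy variants below avoid such hard decisions altogether.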

    3.2 Fuzzy K-Nearest Neighbor Algorithm

    Unlike its crisp counterpart, the fuzzy K-nearest neighbor algorithm finds memberships of data instances in classes instead of looking for complete belonging. Knowing beforehand how strongly the neighbors of an unlabeled query instance belong to each class encourages a more accurate classification.

    Equations are given by Keller et al. (1985) for the fuzzy memberships of training instances in classes:

    μC(x) = 0.51 + 0.49 · (nC/K) if x belongs to class C, and μC(x) = 0.49 · (nC/K) otherwise

    Here nC = number of nearest neighbors of x from class C and

    μC(x) = membership of x in class C.

    And for the memberships of a test instance q:

    μC(q) = [ Σ_{i=1}^{K} μC(qi) · ||q − qi||^(−2/(p−1)) ] / [ Σ_{i=1}^{K} ||q − qi||^(−2/(p−1)) ]

    where p is an integer, p > 1,

    and qi is the i-th nearest neighbour of q, (i = 1,...,K).
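A minimal sketch of Keller's two membership rules, assuming p = 2 (which gives inverse-square-distance weighting); the function names are ours:

```python
import numpy as np

def keller_train_membership(n_C, K, belongs):
    """Keller et al. (1985): mu_C(x) = 0.51 + 0.49*n_C/K when x is in C,
    and 0.49*n_C/K otherwise."""
    base = 0.49 * n_C / K
    return 0.51 + base if belongs else base

def fuzzy_knn_membership(q, neighbors, mu_neighbors, p=2):
    """Distance-weighted aggregation of neighbor memberships for a query q."""
    d = np.linalg.norm(neighbors - q, axis=1)
    w = 1.0 / (d ** (2.0 / (p - 1)) + 1e-12)   # guard against zero distance
    return float(np.sum(w * mu_neighbors) / np.sum(w))

# a training point whose K = 3 nearest neighbors all come from its own class
print(keller_train_membership(n_C=3, K=3, belongs=True))  # -> 1.0
```

Note how the 0.51 term guarantees that a training instance always has the largest membership in its own class.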

    3.3 Adaptive Approach

    Baoli et al. (2004) proposed an adaptive method for imbalanced text data categorization based on the concept of a different K for different classes, i.e., a large K for large classes and a small K for small classes. They suggested calculating the value of K (called KCm for a particular class) with respect to the class size by the following equation:

    KCm = λ + ⌊K · I(Cm) / max_j I(Cj)⌋

    Here K = original input integer defining the number of nearest neighbors,

    KCm = calculated K for each class Cm using the above formula,

    I(Cm) = number of instances in class Cm, where m = 1 and 2,

    λ = constant integer value.
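One plausible reading of the adaptive rule, with λ acting as an additive floor that keeps the per-class K from becoming very small (the exact functional form is our assumption, not taken from Baoli et al.):

```python
import math

def adaptive_k(K, class_sizes, lam=1):
    """Per-class K proportional to class size, shifted up by the constant lam
    so that minority classes never receive a very small K (assumed form)."""
    n_max = max(class_sizes.values())
    return {c: lam + math.floor(K * n / n_max) for c, n in class_sizes.items()}

# a 200-vs-50 binary problem: the majority class gets the larger K
print(adaptive_k(K=10, class_sizes={"majority": 200, "minority": 50}))
# -> {'majority': 11, 'minority': 3}
```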

    3.4 Optimally Weighted Fuzzy K-Nearest Neighbor Algorithm

    The optimally weighted fuzzy K-nearest neighbor algorithm is given by Pham. The optimal weights are based on the kriging concept. In this approach, the K nearest neighbors are first found for a query instance q in the traditional way, and then the optimal weights used to find the membership of q are calculated as shown in the following equation:

    w = Cq^(−1) · Cqx

    Here w = set of weights,

    Cq = covariance matrix between the nearest neighbors of q, and

    Cqx = covariance matrix between q and its nearest neighbors.

    Now the fuzzy membership is assigned to q for class Ci with

    μCi(q) = Σ_{j=1}^{K} wj · μCi(xj)

    Here

    xj = set of nearest neighbors,

    wj = set of optimal weights for (j = 1,2,...,K),

    and Σ_{j=1}^{K} wj = 1.

    This method may produce negative weights, which can be converted to positive values by the following formula:

    wnew = (wm + γ) / (1 + K·γ)

    where γ = −min_m wm.
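The weight computation and the negative-weight fix can be sketched as follows; the Gaussian covariance kernel is an illustrative assumption (Pham derives the covariances from the data):

```python
import numpy as np

def optimal_weights(neighbors, q, kernel=lambda r: np.exp(-r ** 2)):
    """Kriging-style weights: solve C_q w = C_qx, then shift-normalize any
    negative weights via w' = (w + gamma) / (1 + K*gamma), gamma = -min(w)."""
    K = len(neighbors)
    D = np.linalg.norm(neighbors[:, None, :] - neighbors[None, :, :], axis=2)
    C_q = kernel(D)                                       # K x K covariances among neighbors
    C_qx = kernel(np.linalg.norm(neighbors - q, axis=1))  # covariances between q and neighbors
    w = np.linalg.solve(C_q, C_qx)
    gamma = -w.min()
    if gamma > 0:                      # negative weights present
        w = (w + gamma) / (1.0 + K * gamma)
    return w / w.sum()                 # enforce the sum-to-one constraint
```

The final normalization enforces the Σ wj = 1 constraint required by the membership equation above.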

    4 Proposed Methodology

    The proposed algorithm unites the properties of fuzzy nearest neighbor classification, optimal weights, and the adaptive approach to classify imbalanced data. The fuzzy nearest neighbor finds memberships of test instances in classes instead of their complete belonging to one class. These memberships are strengthened by optimal weights. The adaptive approach finds a different K for each class with respect to its size, which reduces the misclassification of imbalanced data.

    Proposed Algorithm:

    Step 1. Find KCi for each class of the training data using the adaptive formula of Section 3.3.

    Step 2. Find the memberships of the training data in each class: let a training instance v ∈ Ci; then

    μCi(v) = 0.51 + 0.49 · (nCi/KCi)

    while taking Σ μCi(v) = 1.

    Step 3. For a test instance u, find a set of nearest neighbors X for any K,

    where X = (x1, x2, ..., xn), for K = n (some integer).

    Step 4. Get the covariance matrix Cu between the nearest neighbors of u.

    Step 5. Get the covariance matrix Cux between u and its nearest neighbors.

    Step 6. Calculate the weight matrix using

    W = Cu^(−1) · Cux

    Step 7. Normalize negative weights to positive (Section 3.4).

    Step 8. Find the membership of the test instance u using μCi(u) = Σ_j wj · μCi(xj).

    Step 9. Assign the class label to the test instance u by C(u) = argmax_Ci μCi(u).

    5 Algorithm Discussions

    The first step of the proposed algorithm finds the values of K, in terms of KCi, for the different classes using the given K, the number of instances in each class and a parameter λ, which is used to keep the result from becoming very small. The second step estimates the memberships of training instances in the two classes, as the equation is intended to find memberships in binary classes only. It suffices to find the membership of an instance in one class, since the membership in the other class can be retrieved by letting the sum of the memberships be one. The third step finds the set of nearest neighbors of the query instance for K. Step four evaluates the covariance between the nearest neighbors of the query instance, and step five finds the covariance between the query instance and its nearest neighbors. Next, the sixth step calculates the weights using both covariance matrices, and negative weights are normalized in step seven to retain robustness. Step eight finds the membership of the test instance with the help of the memberships of its nearest neighbors from the training data and the weights found in the previous steps. Assignment of the class label is done in the last step, i.e., the class label assigned to the test instance is that of the class with the higher membership value.
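The nine steps can be combined into a compact end-to-end sketch for binary classes 0/1; the Gaussian covariance kernel and the additive-floor form of the adaptive K are our assumptions, not the paper's exact choices:

```python
import numpy as np

def wfaknn_predict(X, y, q, K, lam=1):
    """Sketch of the nine WFAKNN steps for binary classes 0/1."""
    # Step 1: adaptive per-class K (assumed additive-floor form)
    classes, counts = np.unique(y, return_counts=True)
    K_c = {c: lam + int(K * n // counts.max()) for c, n in zip(classes, counts)}

    # Step 2: fuzzy membership of every training point in class 1
    # (membership in class 0 follows from mu0 + mu1 = 1)
    mu1 = np.empty(len(X))
    for i, (x, c) in enumerate(zip(X, y)):
        k_i = K_c[c]
        nn = np.argsort(np.linalg.norm(X - x, axis=1))[1:k_i + 1]  # skip self
        mu1[i] = (0.51 if c == 1 else 0.0) + 0.49 * np.sum(y[nn] == 1) / k_i

    # Steps 3-5: neighbors of the test instance u = q and both covariance matrices
    nn = np.argsort(np.linalg.norm(X - q, axis=1))[:K]
    N = X[nn]
    kern = lambda r: np.exp(-r ** 2)              # assumed covariance kernel
    C_u = kern(np.linalg.norm(N[:, None] - N[None, :], axis=2))
    C_ux = kern(np.linalg.norm(N - q, axis=1))

    # Steps 6-7: optimal weights, with negative weights shift-normalized
    w = np.linalg.solve(C_u, C_ux)
    g = -w.min()
    if g > 0:
        w = (w + g) / (1.0 + K * g)
    w = w / w.sum()

    # Steps 8-9: weighted membership, then the higher membership wins
    m1 = float(w @ mu1[nn])
    return int(m1 >= 0.5)
```

On a toy 6-vs-3 imbalanced set, a query near the minority cluster is assigned the minority class even though the majority class dominates the data.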

    6 Experiments & Results

    To judge performance, an experimental comparison is made between our proposed method, the weighted fuzzy adaptive K nearest neighbor algorithm (WFAKNN), the neighbor weighted K nearest neighbor (NWKNN) [36], the hybrid weighted nearest neighbor approach (Adpt-NWKNN), and the fuzzy neighbor weighted approach (Fuzzy-NWKNN) [37]. All of these algorithms come from a similar background of weighting and fuzzy aggregation. Eight datasets with different imbalance ratios are taken from the UCI [38] and KEEL [39] repositories for binary classification with the full feature space. All experiments were performed on the MATLAB platform.

    6.1 Datasets

    All eight numerical datasets are taken from the UCI and KEEL repositories to judge the performance of the proposed algorithm under different imbalance ratios.

    Ionosphere: This radar signal dataset is taken from the UCI repository; it is a collection of 351 instances with 34 attributes and a class attribute. It is a binary class dataset; the classes are 'Good' and 'Bad.' 'Good' instances are returned radar signals indicating free electrons forming some structure in the ionosphere, while 'Bad' signals pass through the layers directly. The 'Good' class is the majority class with 225 instances, while the 'Bad' class represents the minority with 126 instances. The imbalance ratio between the majority and minority classes is 1.79.

    Glass0: The original Glass Identification dataset, with 214 instances and 9 attributes, was used for identifying the glass involved in a crime. The glass comes from seven sources, originally representing seven classes. The KEEL repository provides pre-processed versions of this dataset for a better understanding of imbalanced classification. Glass0 is one binary-class version of this concept, built from all 214 instances with two classes, 'Negative' and 'Positive'. 'Negative' represents the majority class with 144 instances and the 'Positive' class has 70 minority instances. The imbalance ratio of the two classes is 2.05.

    Vertebral: The Vertebral dataset is taken from the UCI repository. It is an orthopaedic dataset of 310 instances, 100 of which are categorized as normal and 210 as abnormal; hence 'Normal' is the minority class and 'Abnormal' is the majority class. Vertebral has 6 attributes, and the dataset shows an imbalance ratio of 2.1.

    Vehicle0: This pre-processed dataset is taken from the KEEL repository; it has 846 instances, 18 attributes, two classes, 'Positive' and 'Negative', and an imbalance ratio of 3.25. The dataset was originally designed for the identification of 3D objects from 2D images and had four classes of vehicles, converted into two-class data for learning.

    Ecoli1: Ecoli1 is another pre-processed dataset representing a specific imbalance, taken from the KEEL repository, with 336 instances and an imbalance ratio of 3.36. E. coli is a type of bacterium that resides in the human or animal intestine and is generally harmless; however, in some cases it may cause diarrhea and other abdominal problems. The dataset has 7 biological attributes and a class attribute. Ecoli1 is considered a binary class dataset, with the minority class 'Positive' holding 77 instances and the majority class 'Negative' holding 259 instances.

    Spectfheart: This is a binary dataset of 267 instances with 44 attributes representing cardiac Single Proton Emission Computed Tomography (SPECT) images. The 267 patients are categorized into two classes, normal (0) and abnormal (1). The 55 normal and 212 abnormal images represent the minority and majority classes respectively, and the imbalance ratio is 3.85. The dataset is taken from the UCI repository.

    New Thyroid: This is a dataset of 215 instances with 5 attributes, taken from the KEEL repository, where 35 'Positive' instances representing hyperthyroidism form the minority class and the remaining 180 'Negative' instances are considered the majority class. The imbalance ratio of the majority and minority classes is 5.14.

    Yeast-2_vs._4: This imbalanced version of the Yeast dataset is taken from the KEEL repository; it has 514 instances with 8 attributes, and the classification task concerns two classes, 'Positive' and 'Negative'. 'Positive' is the minority class with 51 instances, whereas 'Negative' is the majority class with 463 instances. The imbalance ratio is 9.08. The classification task is to localize proteins in yeast cells.

    A short description of the datasets is given in Tab. 1.

    6.2 Evaluation Measures

    Accuracy is a popular traditional evaluation measure for classification, but it is insufficient for imbalanced datasets. Even when it reports good overall results, this accuracy stems from the majority class instances, which are higher in quantity, while neglecting the minority classes, not taking into consideration that the minority classes are often of more interest. In line with the special treatment that imbalanced datasets require, specific measures are also needed for evaluation. Many performance evaluation measures that consider the data distribution have been proposed and perform well on imbalanced data; some of them, namely F-measure, G-mean and AUC, are evaluated in this paper as well. The confusion matrix used to evaluate a classifier on binary data is presented in Tab. 2.

    Table 1: Brief description of datasets

    Table 2: Confusion matrix for binary classification

    True positives (TP) are actual positive instances that are correctly classified as positive, whereas false positives (FP) are actual negative instances incorrectly classified as positive. Similarly, true negatives (TN) are actual negative instances correctly classified as negative, and false negatives (FN) are actual positive instances incorrectly classified as negative. These measures are properly explained in [3]. Performance evaluation measures in terms of these quantities are given below in Tab. 3:

    We use the F-Measure because it is a more comprehensive metric than accuracy: it comprises the weighted ratio of precision and recall and is sensitive to the data distribution as well. The G-Mean evaluates the degree of bias in unevenly distributed data. AUC, the area under the ROC curve, is again a very sensitive measure for such binary data, aggregating the classifier's performance over all possible threshold values.
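The F-measure and G-mean reduce to a few lines of arithmetic on the confusion-matrix counts; the numbers below are hypothetical, chosen only for illustration:

```python
def imbalance_metrics(tp, fp, tn, fn):
    """F-measure and G-mean from the binary confusion matrix of Tab. 2."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # true-positive rate (sensitivity)
    specificity = tn / (tn + fp)     # true-negative rate
    f_measure = 2 * precision * recall / (precision + recall)
    g_mean = (recall * specificity) ** 0.5
    return f_measure, g_mean

# hypothetical counts: 40 TP, 10 FP, 180 TN, 20 FN
f, g = imbalance_metrics(tp=40, fp=10, tn=180, fn=20)
print(round(f, 3), round(g, 3))  # -> 0.727 0.795
```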

    6.3 Empirical Results

    To evaluate the performance of the proposed methodology we use three evaluation measures: F-Measure, AUC and G-Mean. Tab. 4 contains the results for the F-Measure, AUC and G-Mean of NWKNN, Adpt-NWKNN, Fuzzy-NWKNN and WFAKNN on all eight datasets for five values of K, from 5 to 25. Most of the results show the better performance of WFAKNN over the other three approaches.

    Table 3: Evaluation measures based on the confusion matrix

    A graphical comparison of the performance of Fuzzy KNN and Weighted Fuzzy Adaptive KNN (WFAKNN) is given in Figs. 1-3 for F-measure, AUC and G-mean at average values of K. These figures show improvements of WFAKNN over Fuzzy KNN for all these measures. They also show that the performance improvement generalizes across different degrees of imbalance in the different datasets.

    Table 4: Results for F-measure, AUC and G-mean for different values of K


    7 Significance Testing

    Here the t-test [40,41] is applied to find whether there is a statistically significant difference between the proposed approach and the existing approaches used for comparison in this work. The null hypothesis H0 implies that there is no significant difference between the existing algorithm and the proposed algorithm. We performed the t-test in MATLAB at significance level 0.05, with statistics 'h', 'p' and 't'. If the t-test returns h = 0, the null hypothesis is accepted; if h = 1, the null hypothesis is rejected, which implies that there is a significant difference between our proposed algorithm and the existing one. This is confirmed by a p value smaller than the significance level of 0.05 and a calculated value of t higher than the tabulated value of t. In the experiments section we took five different values of the integer K to evaluate the performance measures F-Measure, AUC and G-Mean. Hence, for degrees of freedom = 4 (degrees of freedom (df) = observations − 1, so for 5 observations df = 4), comparative t-test results for the F-Measure on the eight datasets are given in Tab. 5. We can observe that on all datasets except Glass0, WFAKNN performs significantly better than the other algorithms. Even for Glass0, all evaluation measures show better results both for the different values of K and for their average value (Tab. 4).
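The paired t statistic with df = 4 can be reproduced directly; the five matched F-measure scores below are made up for illustration (the paper's actual values are in Tab. 5):

```python
import math

def paired_t(a, b):
    """Paired t statistic over matched scores; df = n - 1."""
    n = len(a)
    d = [x - y for x, y in zip(a, b)]
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance of differences
    return mean / math.sqrt(var / n), n - 1

# hypothetical F-measures of two classifiers at K = 5, 10, 15, 20, 25
t, df = paired_t([0.82, 0.84, 0.85, 0.83, 0.86], [0.78, 0.79, 0.81, 0.80, 0.80])
print(df, t > 2.776)  # H0 is rejected at the 0.05 level when t exceeds 2.776
```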

    Figure 1: F-Measure performances of NWKNN, Adpt-NWKNN, Fuzzy-NWKNN and Weighted Fuzzy Adpt KNN

    Figure 2: AUC performances of NWKNN, Adpt-NWKNN, Fuzzy-NWKNN and Weighted Fuzzy Adpt KNN

    Figure 3: G-Mean performances of NWKNN, Adpt-NWKNN, Fuzzy-NWKNN and Weighted Fuzzy Adpt KNN

    Table 5: Statistics of the paired t-test for the F-Measure of WFAKNN vs. NWKNN, Adpt-NWKNN and Fuzzy-NWKNN for degrees of freedom (df) = 4 and t (tabulated) = 2.776

    8 Conclusions

    In this paper we have proposed a modified weighted fuzzy adaptive nearest neighbor algorithm (WFAKNN) to classify imbalanced data using optimal weights. The fuzzy nearest neighbor approach becomes more impactful when weights are applied and the adaptive approach is then incorporated for imbalanced data. One can observe clearly in the results section that, for the given evaluation measures, the proposed method performs better than the other weighted and/or fuzzy nearest neighbor algorithms. Though the experiments in this paper are limited to binary datasets, they could also be carried out with multi-class datasets in the future. Moreover, feature selection could be applied to improve performance. This approach can also be applied to recent machine learning studies in the healthcare sector or on IoT-generated data, because both are very sensitive to classifier accuracy, which neglect of the data distribution can affect greatly. Possible studies that could be extended with consideration of data imbalance and application of WFAKNN include, but are not limited to, [42-45].

    Funding Statement: The authors received no specific funding for this study.

    Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
