
Decision tree methods: applications for classification and prediction


• Biostatistics in psychiatry (26) •


    Yan-yan SONG1,2*, Ying LU2,3

decision tree; data mining; classification; prediction

    1. Introduction

Data mining is used to extract useful information from large datasets and to display it in easy-to-interpret visualizations. First introduced in the 1960s, decision trees are one of the most effective methods for data mining; they have been widely used in several disciplines [1] because they are easy to use, free of ambiguity, and robust even in the presence of missing values. Both discrete and continuous variables can be used either as target variables or as independent variables. More recently, decision tree methodology has become popular in medical research. An example of the medical use of decision trees is the diagnosis of a medical condition from the pattern of symptoms, in which the classes defined by the decision tree could either be different clinical subtypes of a condition or patients with a condition who should receive different therapies.[2]

    Common usages of decision tree models include the following:

• Variable selection. The number of variables that are routinely monitored in clinical settings has increased dramatically with the introduction of electronic data storage. Many of these variables are of marginal relevance and, thus, should probably not be included in data mining exercises. Like stepwise variable selection in regression analysis, decision tree methods can be used to select the most relevant input variables that should be used to form decision tree models, which can subsequently be used to formulate clinical hypotheses and inform subsequent research.

• Assessing the relative importance of variables. Once a set of relevant variables is identified, researchers may want to know which variables play the major roles. Generally, variable importance is computed based on the reduction of model accuracy (or of the purity of the nodes in the tree) when the variable is removed. In most circumstances, the more records a variable has an effect on, the greater the importance of the variable (see the sketch after this list).

• Handling of missing values. A common – but incorrect – method of handling missing data is to exclude cases with missing values; this is both inefficient and runs the risk of introducing bias into the analysis. Decision tree analysis can deal with missing data in two ways: it can either classify missing values as a separate category that can be analyzed alongside the other categories, or a decision tree model can be built that sets the variable with many missing values as the target variable, after which the missing entries are replaced with the model's predicted values.

• Prediction. This is one of the most important usages of decision tree models. Using the tree model derived from historical data, it is straightforward to predict the outcome for future records.

• Data manipulation. Categorical variables with too many categories and heavily skewed continuous data are common in medical research. In these circumstances, decision tree models can help in deciding how best to collapse categorical variables into a more manageable number of categories or how to subdivide heavily skewed variables into ranges.
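As an illustrative sketch (not part of the original paper), the following Python code uses scikit-learn, an assumed software choice, to demonstrate three of these usages at once: the fitted tree effectively performs variable selection, reports the relative importance of each input, and predicts the outcome for a future record. The variable names x1-x5 and the data-generating rule are invented for the example.

# Hedged sketch: variable selection, importance, and prediction with a
# CART-style tree in scikit-learn (assumed library; data are synthetic).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 5))                          # five candidate inputs x1..x5
y = (X[:, 0] + 0.5 * X[:, 2] > 0.9).astype(int)   # only x1 and x3 drive the target

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Importance is the total impurity reduction attributable to each variable;
# the irrelevant inputs (x2, x4, x5) receive importance near zero.
print(dict(zip(["x1", "x2", "x3", "x4", "x5"],
               tree.feature_importances_.round(3))))

# Prediction for a single future record.
print(tree.predict([[0.8, 0.1, 0.6, 0.5, 0.2]]))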

    2. Basic concepts

Figure 1 illustrates a simple decision tree model that includes a single binary target variable Y (0 or 1) and two continuous variables, x1 and x2, that range from 0 to 1. The main components of a decision tree model are nodes and branches, and the most important steps in building a model are splitting, stopping, and pruning.

Nodes. There are three types of nodes. (a) A root node, also called a decision node, represents a choice that will result in the subdivision of all records into two or more mutually exclusive subsets. (b) Internal nodes, also called chance nodes, represent one of the possible choices available at that point in the tree structure; the top edge of the node is connected to its parent node and the bottom edge is connected to its child nodes or leaf nodes. (c) Leaf nodes, also called end nodes, represent the final result of a combination of decisions or events.

Branches. Branches represent chance outcomes or occurrences that emanate from root nodes and internal nodes. A decision tree model is formed using a hierarchy of branches. Each path from the root node through internal nodes to a leaf node represents a classification decision rule. These decision tree pathways can also be represented as ‘if-then’ rules. For example, “if condition 1 and condition 2 and condition … and condition k occur, then outcome j occurs.”
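The following minimal sketch (an added illustration, assuming scikit-learn and its bundled iris dataset, neither of which is prescribed by the paper) prints a fitted tree's root-to-leaf paths in exactly this 'if-then' form.

# Hedged sketch: each printed path is one classification decision rule.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

# export_text renders the hierarchy of branches as nested conditions,
# which read as rules such as "if petal width <= 0.8 then class 0".
print(export_text(clf, feature_names=list(iris.feature_names)))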

Splitting. Only input variables related to the target variable are used to split parent nodes into purer child nodes of the target variable. Both discrete input variables and continuous input variables (which are collapsed into two or more categories) can be used. When building the model one must first identify the most important input variables, and then split records at the root node and at subsequent internal nodes into two or more categories or ‘bins’ based on the status of these variables. Characteristics that measure the degree of ‘purity’ of the resultant child nodes (i.e., the proportion of records with the target condition) are used to choose between different potential input variables; these characteristics include entropy, Gini index, classification error, information gain, gain ratio, and twoing criteria.[3] This splitting procedure continues until pre-determined homogeneity or stopping criteria are met. In most cases, not all potential input variables will be used to build the decision tree model, and in some cases a specific input variable may be used multiple times at different levels of the decision tree.
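To make these purity characteristics concrete, the short sketch below (an added illustration; the node class counts are invented) computes the Gini index and entropy for a candidate split of a parent node with 40 positive and 60 negative records into two equal-sized child nodes.

# Hedged sketch: Gini index and entropy as node-purity measures.
import numpy as np

def gini(counts):
    # Gini index: 1 - sum(p_k^2); equals 0 for a perfectly pure node.
    p = np.asarray(counts, dtype=float) / np.sum(counts)
    return 1.0 - np.sum(p ** 2)

def entropy(counts):
    # Entropy: -sum(p_k * log2(p_k)); equals 0 for a perfectly pure node.
    p = np.asarray(counts, dtype=float) / np.sum(counts)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

parent, left, right = [40, 60], [35, 15], [5, 45]   # 100 records, split 50/50
print(gini(parent), 0.5 * gini(left) + 0.5 * gini(right))              # impurity falls
print(entropy(parent) - (0.5 * entropy(left) + 0.5 * entropy(right)))  # information gain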

    Figure 1. Sample decision tree based on binary target variable Y

Stopping. Complexity and robustness are competing characteristics of models that need to be simultaneously considered whenever building a statistical model. The more complex a model is, the less reliable it will be when used to predict future records. An extreme situation is to build a very complex decision tree model that spreads wide enough to make the records in each leaf node 100% pure (i.e., every record in a leaf belongs to the same class). Such a decision tree would be overly fitted to the existing observations and have few records in each leaf, so it could not reliably predict future cases and, thus, would have poor generalizability (i.e., lack robustness). To prevent this from happening, stopping rules must be applied when building a decision tree to prevent the model from becoming overly complex. Common parameters used in stopping rules include: (a) the minimum number of records in a leaf; (b) the minimum number of records in a node prior to splitting; and (c) the depth (i.e., number of steps) of any leaf from the root node. Stopping parameters must be selected based on the goal of the analysis and the characteristics of the dataset being used. As a rule of thumb, Berry and Linoff [4] recommend avoiding overfitting and underfitting by setting the target proportion of records in a leaf node to be between 0.25% and 1.00% of the full training dataset.
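These three stopping parameters map directly onto tree-growth arguments in common software. The sketch below shows one such mapping in scikit-learn (an assumed package, with parameter values chosen only for illustration); a float min_samples_leaf expresses the leaf size as a fraction of the training data, which is one way to apply Berry and Linoff's 0.25%-1.00% rule of thumb.

# Hedged sketch: stopping rules expressed as growth limits.
from sklearn.tree import DecisionTreeClassifier

clf = DecisionTreeClassifier(
    min_samples_leaf=0.0025,  # (a) minimum records per leaf, here 0.25% of the data
    min_samples_split=100,    # (b) minimum records in a node prior to splitting
    max_depth=5,              # (c) maximum depth of any leaf from the root node
    random_state=0,
)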

Pruning. In some situations, stopping rules do not work well. An alternative way to build a decision tree model is to grow a large tree first, and then prune it to optimal size by removing nodes that provide less additional information.[5] A common method of selecting the best possible sub-tree from several candidates is to consider the proportion of records with erroneous predictions (i.e., the proportion in which the predicted occurrence of the target is incorrect). Other methods of selecting the best alternative are to use a validation dataset (i.e., dividing the sample in two and testing the model developed on the training dataset on the validation dataset), or, for small samples, cross-validation (i.e., dividing the sample into 10 groups or ‘folds’, testing the model developed from 9 folds on the 10th fold, repeating this for all ten combinations, and averaging the rates of erroneous predictions). There are two types of pruning, pre-pruning (forward pruning) and post-pruning (backward pruning). Pre-pruning uses Chi-square tests [6] or multiple-comparison adjustment methods to prevent the generation of non-significant branches. Post-pruning is used after generating a full decision tree to remove branches in a manner that improves the accuracy of the overall classification when applied to the validation dataset.
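The sketch below illustrates post-pruning in the grow-then-prune spirit described here, using scikit-learn's cost-complexity pruning path together with 10-fold cross-validation to choose among candidate sub-trees (the library, dataset, and pruning criterion are illustrative choices, not the paper's prescription).

# Hedged sketch: grow a full tree, then pick the sub-tree whose pruning
# strength (ccp_alpha) gives the lowest cross-validated error.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# alpha = 0 keeps the full tree; larger alphas remove branches that provide
# less additional information.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
alphas = np.clip(path.ccp_alphas, 0.0, None)  # guard against tiny negative values

scores = [cross_val_score(DecisionTreeClassifier(ccp_alpha=a, random_state=0),
                          X, y, cv=10).mean() for a in alphas]
print("best alpha:", alphas[int(np.argmax(scores))], "accuracy:", max(scores))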

    Decision trees can also be illustrated as segmented space, as shown in Figure 2. The sample space is subdivided into mutually exclusive (and collectively exhaustive) segments, where each segment corresponds to a leaf node (that is, the final outcome of the serial decision rules). Each record is allocated to a single segment (leaf node). Decision tree analysis aims to identify the best model for subdividing all records into different segments.

    Figure 2. Decision tree illustrated using sample space view

    3. Available algorithms and software packages for building decision tree models

Several statistical algorithms for building decision trees are available, including CART (Classification and Regression Trees),[7] C4.5,[8] CHAID (Chi-Squared Automatic Interaction Detection),[9] and QUEST (Quick, Unbiased, Efficient Statistical Tree).[10] Table 1 provides a brief comparison of the four most widely used decision tree methods.[11,12]

Decision trees based on these algorithms can be constructed using data mining software that is included in widely available statistical software packages. For example, there is one decision tree dialogue box in SAS Enterprise Miner [13] which incorporates all four algorithms; the dialogue box requires the user to specify several parameters of the desired model.

The IBM SPSS Modeler [14] software package is more user-friendly; it includes four separate dialog boxes, one for each of the four algorithms (it uses C5.0,[15] an upgraded version of C4.5). Based on the desired method of selecting input variables, the user goes to the dialog box for the corresponding algorithm (i.e., using the following steps: Analyze menu ==> Classify ==> Tree ==> select algorithm of choice). For example, the SPSS syntax associated with the CART algorithm dialog box [16] would be as follows:

tree y [n] by x1 [s] x2 [c] x3 [o]
 /tree display=topdown nodes=statistics
   branchstatistics=yes nodedefs=yes scale=auto
 /depcategories usevalues=[valid]
 /print modelsummary classification risk
 /method type=crt maxsurrogates=auto prune=none
 /growthlimit maxdepth=auto minparentsize=100
   minchildsize=50
 /validation type=none output=bothsamples
 /crt impurity=gini minimprovement=0.0001
 /costs equal
 /priors fromdata adjust=no.

NOTE: [n], [s], [c], and [o] indicate that the variables are nominal, scale, categorical, and ordinal, respectively.

Table 1. Comparison of different decision tree algorithms

    4. Example

We use the analysis of risk factors related to major depressive disorder (MDD) in a four-year cohort study [17] to illustrate the building of a decision tree model. The goal of the analysis was to identify the most important risk factors from a pool of 17 potential risk factors, including gender, age, smoking, hypertension, education, employment, life events, and so forth. The decision tree model generated from the dataset is shown in Figure 3.

All individuals were divided into 28 subgroups from the root node to the leaf nodes through different branches. The risk of having depressive disorder varied from 0 to 38%. For example, only 2% of the non-smokers at baseline had MDD four years later, but 17.2% of the male smokers who had a score of 2 or 3 on the Goldberg depression scale and who did not have a full-time job at baseline had MDD at the 4-year follow-up evaluation. By using this type of decision tree model, researchers can identify the combinations of factors that constitute the highest (or lowest) risk for a condition of interest.

    Figure 3. Decision tree predicting the risk of major depressive disorder based on findings from a four-year cohort study (reprinted with permission from Batterham et al.[17])
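As a hedged illustration of how such subgroup risks are read off a tree (the data below are synthetic and the variable roles hypothetical; this is not Batterham and colleagues' cohort), the sketch computes the observed event rate in each leaf of a fitted tree.

# Hedged sketch: per-leaf event rates, i.e. the subgroup risks of an outcome.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(1000, 3))                       # three binary risk factors
y = (rng.random(1000) < 0.02 + 0.15 * X[:, 0]).astype(int)   # synthetic outcome risk

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
leaf = tree.apply(X)                                         # leaf id for each record
for node in np.unique(leaf):
    mask = leaf == node
    print(f"leaf {node}: n={mask.sum()}, risk={y[mask].mean():.1%}")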

    5. Discussion

    The decision tree method is a powerful statistical tool for classification, prediction, interpretation, and data manipulation that has several potential applications in medical research. Using decision tree models to describe research findings has the following advantages:

• Simplifies complex relationships between input variables and target variables by dividing original input variables into significant subgroups.

• Easy to understand and interpret.

• Non-parametric approach without distributional assumptions.

• Easy to handle missing values without needing to resort to imputation.

• Easy to handle heavily skewed data without needing to resort to data transformation.

• Robust to outliers.

As with all analytic methods, there are also limitations of the decision tree method that users must be aware of. The main disadvantage is that it can be subject to overfitting and underfitting, particularly when using a small dataset. This problem can limit the generalizability and robustness of the resultant models. Another potential problem is that strong correlation between different potential input variables may result in the selection of variables that improve the model statistics but are not causally related to the outcome of interest. Thus, one must be cautious when interpreting decision tree models and when using the results of these models to develop causal hypotheses.

Loh [18] and Strobl [19] provided a comprehensive review of the statistical literature on classification tree methods that may be useful for readers who want to learn more about the statistical theories behind the decision tree method. There are several further applications of decision tree models that have not been considered in this brief overview. We have described decision tree models that use binary or continuous target variables; several authors have developed other decision tree methods to be employed when the endpoint is the prediction of survival.[20-27] Our discussion was limited to cases in which the selection of input variables was based on statistical properties, but in the real world the selection of input variables may be based on the relative cost of collecting the variables or on the clinical meaningfulness of the variables; Jin and colleagues [28,29] introduced an alternative classification tree method that allows for the selection of input variables based on a combination of preference (e.g., based on cost) and non-inferiority to the statistically optimal split. Another extension of the decision tree method is to develop a decision tree that identifies subgroups of patients who should have different diagnostic tests or treatment strategies to achieve optimal medical outcomes.[30]

Conflict of interest

The authors report no conflict of interest related to this manuscript.

Funding

No funding was received for the preparation of this report.

References

1. Hastie TJ, Tibshirani RJ, Friedman JH. The Elements of Statistical Learning: Data Mining, Inference and Prediction. Second Edition. Springer; 2009. ISBN 978-0-387-84857-0

2. Fallon B, Ma J, Allan K, Pillhofer M, Trocmé N, Jud A. Opportunities for prevention and intervention with young children: lessons from the Canadian incidence study of reported child abuse and neglect. Child Adolesc Psychiatry Ment Health. 2013; 7: 4

3. Patel N, Upadhyay S. Study of various decision tree pruning methods with their empirical comparison in WEKA. Int J Comp Appl. 60(12): 20-25

4. Berry MJA, Linoff G. Mastering Data Mining: The Art and Science of Customer Relationship Management. New York: John Wiley & Sons, Inc.; 1999

5. Hastie T, Tibshirani R, Friedman J. The Elements of Statistical Learning. Springer; 2001. pp. 269-272

6. Zibran MF. Chi-Squared Test of Independence. Department of Computer Science, University of Calgary, Alberta, Canada; 2012

7. Breiman L, Friedman JH, Olshen RA, Stone CJ. Classification and Regression Trees. Belmont, California: Wadsworth, Inc.; 1984

8. Quinlan RJ. C4.5: Programs for Machine Learning. San Mateo, California: Morgan Kaufmann Publishers, Inc.; 1993

9. Kass GV. An exploratory technique for investigating large quantities of categorical data. Appl Stat. 1980; 29: 119-127

10. Loh W, Shih Y. Split selection methods for classification trees. Statistica Sinica. 1997; 7: 815-840

11. Bhukya DP, Ramachandram S. Decision tree induction: an approach for data classification using AVL-Tree. Int J Comput Electr Eng. 2010; 2(4): 660-665. doi: http://dx.doi.org/10.7763/IJCEE.2010.V2.208

12. Lin N, Noe D, He X. Tree-based methods and their applications. In: Pham H, editor. Springer Handbook of Engineering Statistics. London: Springer-Verlag; 2006. pp. 551-570

13. SAS Institute Inc. SAS Enterprise Miner 12.1 Reference Help, Second Edition. USA: SAS Institute Inc; 2011

14. IBM Corporation. IBM SPSS Modeler 17 Modeling Nodes. USA: IBM Corporation; 2015

15. Is See5/C5.0 Better Than C4.5? [Internet]. Australia: Rulequest Research; c1997-2008 [updated 2008 Oct; cited 2015 April]. Available from: http://rulequest.com/see5-comparison.html

16. IBM Corporation. IBM SPSS Statistics 23 Command Syntax Reference. USA: IBM Corporation; 2015

17. Batterham PJ, Christensen H, Mackinnon AJ. Modifiable risk factors predicting major depressive disorder at four-year follow-up: a decision tree approach. BMC Psychiatry. 2009; 9: 75. doi: http://dx.doi.org/10.1186/1471-244X-9-75

18. Loh WY. Fifty years of classification and regression trees. Int Stat Rev. 2014; 82(3): 329-348. doi: http://dx.doi.org/10.1111/insr.12016

19. Strobl C. Discussions. Int Stat Rev. 2014; 82(3): 349-352. doi: http://dx.doi.org/10.1111/insr.12059

20. Segal M. Regression trees for censored data. Biometrics. 1988; 44: 35-47

21. Segal M, Bloch D. A comparison of estimated proportional hazards models and regression trees. Stat Med. 1989; 8: 539-550

22. Segal M. Features of tree-structured survival analysis. Epidemiology. 1997; 8: 344-346

23. Therneau T, Grambsch P, Fleming T. Martingale based residuals for survival models. Biometrika. 1990; 77: 147-160

24. LeBlanc M, Crowley J. Relative risk trees for censored survival data. Biometrics. 1992; 48: 411-425

25. Keles S, Segal M. Residual-based tree-structured survival analysis. Stat Med. 2002; 21: 313-326

26. Zhang HP. Splitting criteria in survival trees. 10th International Workshop on Statistical Modeling. Innsbruck (Austria): Springer-Verlag; 1995. pp. 305-314

27. Jin H, Lu Y, Stone K, Black DM. Alternative tree structured survival analysis based on variance of survival time. Med Decis Making. 2004; 24(6): 670-680. doi: http://dx.doi.org/10.1177/0272989X10377117

28. Jin H, Lu Y, Harris ST, Black DM, Stone K, Hochberg MC, Genant HK. Classification algorithm for hip fracture prediction based on recursive partitioning methods. Med Decis Making. 2004; 24(4): 386-398. doi: http://dx.doi.org/10.1177/0272989X04267009

29. Jin H, Lu Y. A procedure for determining whether a simple combination of diagnostic tests may be noninferior to the theoretical optimum combination. Med Decis Making. 2008; 28(6): 909-916. doi: http://dx.doi.org/10.1177/0272989X08318462

30. Li C, Gluer CC, Eastell R, Felsenberg D, Reid DM, Rox DM, Lu Y. Tree-structured subgroup analysis of receiver operating characteristic curves for diagnostic tests. Acad Radiol. 2012; 19(12): 1529-1536. doi: http://dx.doi.org/10.1016/j.acra.2012.09.007

(received: 2015-04-01; accepted: 2015-04-09)

Dr. Yan-yan Song is a Lecturer in the Department of Biostatistics at the Shanghai Jiao Tong University School of Medicine and is currently a visiting scholar in the Division of Biostatistics, Department of Health Research and Policy, Stanford University School of Medicine. She is responsible for the data management and statistical analysis platform of the Translational Medicine Collaborative Innovation Center of Shanghai Jiao Tong University. She is a fellow of the China Association of Biostatistics and a member of the Ethics Committee of Ruijin Hospital, which is affiliated with the Shanghai Jiao Tong University. She has experience in the statistical analysis of clinical trials, diagnostic studies, and epidemiological surveys, and has used decision tree analyses to search for biomarkers of early depression.


Summary: Decision tree methodology is a commonly used data mining method for establishing classification systems based on multiple covariates or for developing prediction algorithms for a target variable. This method classifies a population into branch-like segments that construct an inverted tree with a root node, internal nodes, and leaf nodes. The algorithm is non-parametric and can efficiently deal with large, complicated datasets without imposing a complicated parametric structure. When the sample size is large enough, study data can be divided into training and validation datasets: the training dataset is used to build a decision tree model, and the validation dataset is used to decide on the appropriate tree size needed to achieve the optimal final model. This paper introduces frequently used algorithms for developing decision trees (including CART, C4.5, CHAID, and QUEST) and describes the SPSS and SAS programs that can be used to visualize tree structure.

[Shanghai Arch Psychiatry. 2015; 27(2): 130-135. http://dx.doi.org/10.11919/j.issn.1002-0829.215044]

    1Department of Pharmacology and Biostatistics, Institute of Medical Sciences, Shanghai Jiao Tong University School of Medicine, Shanghai, China

    2Division of Biostatistics, Department of Health Research and Policy, Stanford University, Stanford, CA, USA

    3Veterans Affairs Cooperative Studies Program Palo Alto Coordinating Center, the VA Palo Alto Health Care System, Palo Alto, CA, USA

*Correspondence: yanyansong@sjtu.edu.cn



The full-text Chinese version of this article is available for free reading and download at http://dx.doi.org/10.11919/j.issn.1002-0829.215044 from June 6, 2015.
