
    Data Selection Using Support Vector Regression

Advances in Atmospheric Sciences, Vol. 32, No. 3, 2015



Michael B. RICHMAN*1, Lance M. LESLIE1, Theodore B. TRAFALIS2, and Hicham MANSOURI3

1 School of Meteorology and Cooperative Institute for Mesoscale Meteorological Studies, University of Oklahoma, Norman, Oklahoma 73072, USA

2 School of Industrial and Systems Engineering, University of Oklahoma, Norman, Oklahoma 73019, USA

3 Power Costs, Inc., 301 David L. Boren Blvd., Suite 2000, Norman, Oklahoma 73072, USA

Geophysical data sets are growing at an ever-increasing rate, requiring computationally efficient data selection (thinning) methods to preserve essential information. Satellites, such as WindSat, provide large data sets for assessing the accuracy and computational efficiency of data selection techniques. A new data thinning technique, based on support vector regression (SVR), is developed and tested. To manage large on-line satellite data streams, observations from WindSat are formed into subsets by Voronoi tessellation and then each is thinned by SVR (TSVR). Three experiments are performed. The first confirms the viability of TSVR for a relatively small sample, comparing it to several commonly used data thinning methods (random selection, averaging and Barnes filtering), producing a 10% thinning rate (90% data reduction), low mean absolute errors (MAE) and large correlations with the original data. A second experiment, using a larger dataset, shows TSVR retrievals with MAE < 1 m s-1 and correlations ≥ 0.98. TSVR was an order of magnitude faster than the commonly used thinning methods. A third experiment applies a two-stage pipeline to TSVR, to accommodate online data. The pipeline subsets reconstruct the wind field with the same accuracy as the second experiment and are an order of magnitude faster than the nonpipeline TSVR. Therefore, pipeline TSVR is two orders of magnitude faster than commonly used thinning methods that ingest the entire data set. This study demonstrates that TSVR pipeline thinning is an accurate and computationally efficient alternative to commonly used data selection techniques.

Key words: data selection, data thinning, machine learning, support vector regression, Voronoi tessellation, pipeline methods

    1. Introduction

The quantity of geophysical data is increasing at a rapid rate. Hence, it is essential to identify and/or select features that preserve relevant information in the data. Data selection has as its two main aims the removal of redundant and faulty data. Here, the emphasis is on redundant data, so the terms data selection and data thinning will be used interchangeably. Redundant data arise from two main sources: when the data density is greater than the spatial and temporal resolution of the analysis grid, and when the data are not linearly independent. Penalties for retaining redundant data are the (possibly massive) increase in computational cost, the failure to satisfy key assumptions of the data analysis scheme (Lorenc, 1981) and the increased risk of overfitting (particularly for problems with high dimensions).

The need for data selection is exemplified by satellite observations. Satellites are among the most important contributors of observations to the data selection process and, hence, to the analysis. Notably, satellites provide high-resolution observations over data-poor regions, especially the oceans and sparsely populated land areas. Historically, data redundancy issues led to the development of data selection approaches that were simple and cost effective. These included: allocating the observations to geographical grid boxes and then averaging the data in each box to produce so-called super-observations, or "superobs" (Lorenc, 1981; Purser et al., 2000); the selection of observations, in both meridional and zonal directions, with random sampling of the observations (Bondarenko et al., 2007); and the use of filters, such as the Barnes scheme (Barnes, 1964). Owing to their simplicity, and because they are non-adaptive, such strategies are referred to as unintelligent data selection techniques. For example, they do not specify targeted areas of interest or weight the data according to their contribution to minimizing differences between the thinned and non-thinned data.
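As a concrete illustration of the grid-box averaging ("superob") idea, the following Python sketch bins observations into h-degree latitude/longitude boxes and averages them within each box. The function name, the box width h, and the array layout are illustrative assumptions, not taken from the cited studies.

```python
import numpy as np

def superob_average(lat, lon, wind, h=1.0):
    """Average wind observations inside h-degree grid boxes ("superobs")."""
    # Integer box indices for each observation
    i = np.floor(lat / h).astype(int)
    j = np.floor(lon / h).astype(int)
    boxes = np.stack([i, j], axis=1)
    # Group observations by box and average each group
    _, inverse = np.unique(boxes, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    mean_lat = np.bincount(inverse, weights=lat) / counts
    mean_lon = np.bincount(inverse, weights=lon) / counts
    mean_wind = np.bincount(inverse, weights=wind) / counts
    return mean_lat, mean_lon, mean_wind
```

Random selection would instead keep a single randomly chosen observation per box, and a Barnes filter would replace the simple mean with a distance-weighted (Gaussian) average.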

Recently, various intelligent data selection strategies have emerged (e.g., Lazarus et al., 2010). Such approaches are effective in identifying and removing redundant data and have other desirable features. One example is the Density Adjusted Data Thinning (DADT; Ochotta et al., 2005; 2007), and its successor, the modified DADT (mDADT; Lazarus et al., 2010). The intelligent data selection schemes are adaptive, as they attempt to retain those observations that are less highly correlated with other observations, but contribute more significantly to the retention of the information content in the observations (e.g., they employ metrics based on gradients and/or curvature of the fields). Intelligent data selection schemes usually require definitions of redundancy measures, and their sampling strategies iteratively remove observations that fail to meet the metric threshold criteria.

The present work develops an entirely different, kernel-based, intelligent data selection technique using Support Vector Machines (SVMs). SVMs require neither a priori specification of metrics nor of thinning rates. SVMs are alternatives to artificial neural networks, decision trees and Bayesian networks for classification and prediction tasks (Schölkopf and Smola, 2002) used in supervised learning, such as statistical classification and regression analysis. Although SVMs were introduced several decades ago (Vapnik, 1982), they have been investigated extensively by the machine learning community only since the mid-1990s (Shawe-Taylor and Cristianini, 2004).

SVMs require solving a quadratic programming problem with linear constraints. Therefore, the speed of the algorithm is a function of the number of observations (data points) used during the training period. Hence, the SVM solution to problems comprised of numerous data points is computationally inefficient. Several methods have been proposed to ameliorate this problem. Platt (1999) applied Sequential Minimal Optimization (SMO) to break the large quadratic programming problem into a series of the smallest possible, analytically solvable problems. A faster SMO SVM algorithm, advantageous for real-time or online prediction or classification for large scale problems, was suggested by Bottou and LeCun (2004). Musicant and Mangasarian (2000) applied a linear program SVM method to accommodate very large datasets. Bakır et al. (2004) selectively removed data using probabilistic estimates, without modifying the location of the decision boundary. Other techniques used online training to reduce the impact of large data sets. Bottou and LeCun (2005) showed that performing a single epoch of an online algorithm converges to the solution of the learning problem. Laskov et al. (2006) developed incremental SVM learning with the aim of providing a fast, numerically stable and robust implementation. Support Vector Regression (SVR) uses the kernel approach from SVM to replace the inner product in regression. It is discussed extensively by Smola and Schölkopf (1998). SVM techniques have been applied to small-scale meteorological applications, such as rainfall and diagnostic analysis fields supporting tornado outbreaks. These include the studies of Son et al. (2005), Santosa et al. (2005), Trafalis et al. (2005), and, in satellite data retrievals, Wei and Roan (2012). The present study seeks to further enhance SVR in two respects: (1) by applying a Voronoi tessellation (Bowyer, 1981) to reduce the size of the large observational data sets and (2) by adopting a pipeline methodology (Quinn, 2004) to improve the computational efficiency of the data selection scheme.

In section 2, large-scale problems using satellite datasets are described. In section 3, it is shown how Voronoi tessellation reduces the size of the large observational data sets, and how a pipeline SVM methodology substantially enhances the computational efficiency of the data selection scheme. The results are presented in section 4. Finally, conclusions are discussed in section 5.

    2. Data

This study employs data from the WindSat microwave polarimetric radiometry sensor (Gaiser et al., 2004). WindSat provides environmental data products, including latitude, longitude, cloud liquid water, column integrated precipitable water, rain rate, and sea surface temperature. WindSat measurements over the ocean are used operationally to generate analysis fields and also as input to numerical weather prediction models of the U.S. Navy, the U.S. National Oceanic and Atmospheric Administration (NOAA) and the United Kingdom Meteorological Office. As a polarimetric radiometer, WindSat measures not only the principal polarizations (vertical and horizontal), but also the cross-correlation of the vertical and horizontal polarizations. The cross-correlation terms represent the third and fourth parameters of the modified Stokes vector (Gaiser et al., 2004). The Stokes vector provides a full characterization of the electromagnetic signature of the ocean surface and the independent information needed to uniquely determine the wind direction (Chang et al., 1997).

To illustrate the data selection procedure introduced herein, it suffices to explore a single data type, namely, sea surface wind (SSW) speeds and directions. For SSW data, it is necessary to account not only for random errors but also for spatially correlated errors. Typical ascending swaths for a 24-hour sample of WindSat data provide ~1.5 million observations. Given this massive number of data points, oversampling of wind data can severely degrade the analysis and, consequently, the model forecasts.

Three experiments were carried out using different WindSat datasets. The first experiment was designed to assess, on a relatively small sample, the accuracy and computational efficiency of a Voronoi tessellation followed by SVR to thin the WindSat data. Hereafter, this sequential combination of Voronoi tessellation followed by SVR will be referred to as "TSVR". Two hours of WindSat data from 1 January 2005 were chosen in the region 127°W to 145°E longitude and 23° to 42°N latitude, providing 13 540 observations for the data selection process. Additionally, TSVR was compared to three commonly used data thinning techniques (simple averaging, random selection and a Barnes filter) to assess the relative accuracy and computational efficiency of each method. A second experiment used 226 393 observations to determine if the accuracy and computational efficiency gains by TSVR were preserved with a much larger dataset. The third experiment employs a pipeline methodology (section 3.3), as it has been employed successfully to achieve much higher computational efficiency (e.g., Ragothaman et al., 2014). Such an approach is expected to enhance real-time processing of an on-line stream of WindSat data.

    3. Learning Machine Methodologies

3.1. Voronoi Tessellation

Experiments show that the standard SVR algorithm loses computational efficiency when analyzing more than several thousand observations (Platt, 1999). Since the WindSat data sets used in this study are in excess of this, and can exceed 10^6 observations, direct application of SVR is not feasible. Methods have been proposed to reduce this problem (e.g., Platt, 1999; Musicant and Mangasarian, 2000). Voronoi tessellation partitions a plane with p points into convex polygons such that each polygon contains exactly one generating point and every point in a given polygon is closer to its generating point than to any other. The cells are called polytopes (e.g., Voronoi polygons). They were employed by Voronoi (1908) and have been applied in diverse fields, such as computer graphics, epidemiology, geology, and meteorology. As shown in Fig. 1, the tessellation is achieved by allocating the data points to a number of Voronoi cells (Du et al., 1999; Mansouri et al., 2007; Gilbert and Trafalis, 2009; Helms and Hart, 2013). The process uses the Matlab "voronoi" function (Matlab, 2012).

As mentioned above, for a discrete set S of points in R^n and for any point x, there is one point of S closest to x. More formally, let X be a space (and S a nonempty subset of X) provided with a distance function d. Let C, a nonempty subset of X, be a set of p centroids P_c, c ∈ [1, p]. The Voronoi cell, or Voronoi region, V_c, associated with the centroid P_c is the set of all points in X whose distance to P_c is not greater than their distance to the other centroids P_j, where j is any index different from c. That is, if D(x, A) = inf{d(x, a) | a ∈ A} denotes the distance between the point x and the subset A, then V_c = {x ∈ X | d(x, P_c) ≤ d(x, P_j), for all j ≠ c}.

In general, the set of all points closer to P_c than to any other point of S is called the Voronoi cell for P_c. The set of such polytopes is the Voronoi tessellation corresponding to the set S. In two-dimensional space, a Voronoi tessellation can be represented as shown in Fig. 1. Since the number of data points inside each Voronoi polygon is much less than for the full data set, the computational time is reduced greatly. Moreover, further efficiency can be gained by using parallel computing, solving a set of Voronoi polygons simultaneously.
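A minimal Python sketch of this partitioning step is given below, assuming NumPy and SciPy are available. Taking every k-th observation as a seed and querying a k-d tree for the nearest seed is one simple way to realize the nearest-centroid assignment described above; the function name and the default k are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def voronoi_partition(points, k=100):
    """Assign each observation to the Voronoi cell of its nearest centroid.

    points: (l, n) array of observations. Every k-th point is used as a
    centroid (seed) P_c, so roughly p = l / k cells are generated.
    Returns the centroid array and, for each point, its cell index.
    """
    centroids = points[::k]               # seeds P_c, c = 1, ..., p
    tree = cKDTree(centroids)             # supports nearest-centroid queries
    _, cell_index = tree.query(points)    # V_c: points closest to P_c
    return centroids, cell_index
```

Because each cell is independent of the others, the subsequent per-cell regressions can be distributed across CPUs, which is the parallelization noted above.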

3.2. Support Vector Regression

In SVR, it is assumed that there is a data source providing a sequence of l observations and no distributional assumptions are made. Each observation (data point) is represented as a vector with a finite number n of continuous and/or discrete variables that can be denoted as a point in the Euclidean space R^n. Hence, the l observations are data points in the Euclidean space R^n.

The l observations are divided into p cells using Voronoi tessellation. The methodology consists of making each k-th observation a seed or "centroid" for a Voronoi cell V_c, ∀c ∈ [1, p]. The parameter k is set such that p ≈ l/k; hence, for a larger k, fewer cells will be generated. Each cell V_c will be composed of data points represented by x_{i,c} ∈ R^n, ∀i ∈ [1, l]. In regression problems, each observation x_{i,c} is related to a unique real-valued scalar target denoted by y_{i,c}. The couplets (x_{i,c}, y_{i,c}) in R^{n+1} are a set of points that have a continuous unknown shape that is not assumed to follow a known distribution. The objective of support vector regression (SVR) is to find a machine learning prediction function (in our application, this is an estimation at a particular time t, rather than a forecast at time t + Δt), denoted by f_c for each cell V_c, such that the differences between f_c(x_{i,c}) and the target values y_{i,c} are minimized.

In the present study, the target is either the u- or the v-component of the winds. By introducing, for each observation x_{i,c}, a set of positive slack variables ξ_{i,c}, which are minimized, the following set of constraints for the regression problems is generated for each cell V_c:

For linear regression, in the SVM literature, f_c belongs to a class of functions denoted by F, such that:

where b_c is the bias term, B_c > 0 is a constant that bounds the weight space, w_c = Σ_j α_{j,c} x_{j,c}, and α_{j,c} ∈ R, ∀j ∈ [1, l].

In the case of nonlinear regression, the class of functions F is changed to allow for linear regression in the Hilbert space to which the observations x_{i,c} will be mapped. This is achieved by introducing a nonnegative definite kernel k: R^n × R^n → R to induce a new Hilbert space H and a map φ: R^n → H such that k(x, y) = ⟨φ(x), φ(y)⟩_H for any x and y in R^n. Hence, F becomes:

where w_c = Σ_j α_{j,c} φ(x_{j,c}), and α_{j,c} ∈ R, ∀j ∈ [1, l]. Explicit knowledge of H and φ is not required. Therefore, the set of constraints Eq. (1) becomes:

SVM allows for an objective function that reduces the slack variables and the expected value of |f_c(x_{i,c}) − y_{i,c}|. To achieve that objective, the quantities b_c, ξ_{i,c}, and ‖w_c‖_H are minimized:

where C > 0 is a positive trade-off constant that penalizes the non-zero values of the ξ_{i,c}, 1 is an l × 1 vector of ones, and y_c is the vector with elements y_{i,c}.

The optimal solution of Eq. (5) yields the following prediction function:

The vectors x_{i,c} for which the values of α_{i,c} are nonzero are called support vectors.
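As an illustration of how the support vectors of each cell could be obtained in practice, the following Python sketch (an assumed implementation using scikit-learn's SVR, not the authors' code) fits an RBF-kernel SVR to one Voronoi cell and returns only its support vectors, i.e., the thinned subset for that cell; the hyperparameters C and gamma are placeholders.

```python
import numpy as np
from sklearn.svm import SVR

def thin_cell(X_cell, y_cell, C=10.0, gamma=1.0):
    """Fit an RBF-kernel SVR to one Voronoi cell and keep its support vectors.

    X_cell: (m, n) observations in the cell; y_cell: the u- or v-component.
    Returns the retained (support-vector) observations, their targets, and
    the fitted model, which reconstructs the field via model.predict().
    """
    model = SVR(kernel="rbf", C=C, gamma=gamma)
    model.fit(X_cell, y_cell)
    sv = model.support_                  # indices of the support vectors
    return X_cell[sv], y_cell[sv], model
```

Running such a function independently for each cell, and separately for the u- and v-components, mirrors the per-cell, per-component TSVR analyses described later in the paper.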

From Eq. (3) a kernel is required. In this work, several kernels were tested for their ability to select a smaller number of observations with a minimum loss of information. Those tested were: the linear kernel; the radial basis function (RBF) kernel; the polynomial kernel of degree q; and the sigmoidal kernel, where σ, g, a and θ are scaling constants.
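For reference, the sketch below writes these kernels out in Python. The exact scaling conventions for σ, g, a and θ are assumptions that follow the usual SVM literature (e.g., Smola and Schölkopf, 1998), since the paper's displayed formulas are not reproduced here.

```python
import numpy as np

def linear_kernel(x, y):
    # Plain inner product
    return np.dot(x, y)

def rbf_kernel(x, y, sigma=1.0):
    # Gaussian radial basis function with scale sigma
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def polynomial_kernel(x, y, g=1.0, q=2):
    # Polynomial kernel of degree q with scale g
    return (g * np.dot(x, y) + 1.0) ** q

def sigmoid_kernel(x, y, a=1.0, theta=1.0):
    # Sigmoidal (hyperbolic tangent) kernel with scales a and theta
    return np.tanh(a * np.dot(x, y) + theta)
```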

3.3. Pipeline TSVR

To improve the efficiency of the TSVR, a pipeline methodology (Quinn, 2004) is introduced to allow for an online stream of meteorological satellite data. The pipeline approach is appropriate for such data because the satellite samples a swath of new wind data as it orbits. Within each Voronoi polygon, the pipeline is applied to the variables used to estimate the winds by TSVR. A two-stage pipeline (with 50% overlap, as shown in Fig. 2) is applied that fetches and preprocesses new data while old data are being processed in the CPU. Figure 2 illustrates the pipeline, showing that the orbital swath is divided into discrete steps and how these new data are incorporated into the TSVR process. Figure 2 shows the pipeline window, of width four CPU time units, ingesting the data set. At each step, the most recent data are included in the window, while the oldest data are released. Next, the window moves to the right by one-half step. Hence, instead of thinning all the data within a window, the cells outside the window are dropped and new Voronoi cells are formed that contain only the new data. If this overlapping approach were not adopted, the data would have to be ingested, preprocessed and analyzed prior to moving on to the next batch of data, thereby reducing the efficiency of the process.
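A schematic Python sketch of this overlapping-window idea is shown below. It is an assumed simplification, not the authors' implementation: the stream is treated as a sequence of half-window batches, only the newly arrived half is thinned at each step, and the thinned previous half is reused.

```python
def pipeline_thin(stream_batches, thin_fn):
    """Thin an online stream with a two-stage, 50%-overlap pipeline.

    stream_batches: half-window batches of observations in arrival order.
    thin_fn: thins one batch (e.g., TSVR over its Voronoi cells) and
             returns the retained observations as a list.
    Returns the thinned content of each full window position.
    """
    thinned_halves = []
    windows = []
    for t, batch in enumerate(stream_batches):
        thinned_halves.append(thin_fn(batch))      # thin only the new half
        if t >= 1:
            # A full window = previously thinned half + newly thinned half
            windows.append(thinned_halves[t - 1] + thinned_halves[t])
    return windows
```

Because only the newly arrived data are re-thinned after the first step, each subsequent step is much cheaper than re-processing the whole window, which is consistent with the timing gains reported in section 4.3.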

3.4. Measures of differences between non-thinned and thinned data

    Mean squared differences(commonly referred to as MSE),mean absolute differences(MAE),as well as the correlationbetween the original(non-thinned)and thinnedsatellite observed winds are employed to measure the quality of the thinned observations.MSE,MAE and correlations are defned in Wilks(2011).These are commonly applied metrics to measure differences between two felds.
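These measures have standard definitions (Wilks, 2011); for completeness, a brief Python rendering is given below, with obs and thinned assumed to be equally sized arrays of collocated wind components.

```python
import numpy as np

def mae(obs, thinned):
    # Mean absolute difference between non-thinned and thinned winds
    return np.mean(np.abs(obs - thinned))

def mse(obs, thinned):
    # Mean squared difference
    return np.mean((obs - thinned) ** 2)

def correlation(obs, thinned):
    # Pearson correlation coefficient
    return np.corrcoef(obs, thinned)[0, 1]
```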

    4. Results

4.1. Results of the first experiment

Table 1. MAE, MSE and correlation metrics comparing the differences between observed and thinned u- and v-components of wind for different SVR kernels. The kernel selected (RBF 1) is in bold font.

The main objective of this experiment is to assess the feasibility of the TSVR, and to determine the most effective kernel, using a small sample (13 540 observations) of WindSat data. Support vectors are used for the reproduction of the wind field after data selection. Because of the intelligent adaptive capability of the TSVR, fewer than 8% of the observed satellite data were needed to reconstruct the wind field. To quantify the accuracy of the reconstructed winds using TSVR, the thinned winds are compared to the non-thinned observations. From Eq. (3), a kernel must be selected to generate the support vectors and reconstruct the wind fields. Table 1 shows metrics (MSE, MAE and correlations) for the kernels defined in section 3.2. The various kernels tested were: linear; seven radial basis functions with the σ parameter varying from 0.5 to 100; polynomials with g = 1 and of orders (q) 2 and 3; and sigmoidal with the two scale parameters (a, θ) set to 1. The smallest differences between thinned and non-thinned wind data were obtained for the RBF kernel, with a u-component MAE (MSE) of 1.05 m s-1 (5.99 m2 s-2), which are 44% (53%) reductions in the discrepancies, respectively, relative to any non-RBF kernel. For the v-component, the corresponding reductions for the RBF kernel, compared to a non-RBF kernel, were even larger, at 63% (65%). The variances explained (correlations squared) are 82.8% and 96.0% for the u- and v-components, representing improvements of 33% and 6%, respectively, over any non-RBF kernel. Therefore, the RBF kernel with parameter 1 is used for all subsequent TSVR analyses.

Figure 3 shows frequency counts of the reconstructed wind errors for the 13 540 observations thinned by TSVR. For the u-component (Fig. 3a), 77% (87%) of the discrepancy magnitudes are ≤1 m s-1 (2 m s-1), which is at or below the accepted observation error for these data (Quilfen et al., 2007). Similar discrepancies were found for the v-component (Fig. 3b). Both distributions are highly leptokurtic, illustrating the efficacy of TSVR. Figure 4 presents the thinned (Figs. 4a, c) and non-thinned (Figs. 4b, d) satellite wind field contours for the u- and v-components. The close spatial correspondence of the patterns for each component is consistent with the large positive correlations in Table 1 for the RBF 1 kernel.

For the present problem, most of the support vectors have alpha values near zero (Fig. 5), thus they have an insignificant contribution to the final solution. From Eq. (6), those support vectors with zero or near-zero alpha values are ignored, providing further data reduction. For the present analysis, Figure 5 illustrates the large data reduction capability of SVR for these data. From the available 13 540 data points, only ~1000 support vectors (<8%) are required to reconstruct the wind vector field with the aforementioned high level of accuracy. Specifically, for each Voronoi cell, the satellite data points inside each cell are used to train the SVR. Fewer than 8% of the observations were support vectors and are retained; therefore, the thinning rate is >92%. The <8% of observations that are support vectors have an MAE of 0; the MAE of the remaining >92% of data points was calculated using only those support vectors. Since the percentage of support vectors is a function of the complexity of the data field, it will vary according to the spatial and temporal data structure.
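The further reduction from discarding near-zero alpha values can be sketched as a simple thresholding step; the example below assumes a fitted scikit-learn SVR (whose dual_coef_ attribute holds the signed alpha values of its support vectors) and an illustrative cutoff.

```python
import numpy as np

def drop_small_alphas(model, X_sv, y_sv, threshold=1e-3):
    """Discard support vectors whose |alpha| is negligible in Eq. (6).

    model: a fitted sklearn SVR; X_sv, y_sv: its support vectors and targets.
    """
    alphas = model.dual_coef_.ravel()      # one signed alpha per support vector
    keep = np.abs(alphas) > threshold      # negligible terms contribute little
    return X_sv[keep], y_sv[keep]
```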

4.2. Results of the second experiment

Given the large data reduction and high level of accuracy in reproducing the wind fields provided by TSVR, as found in section 4.1, a considerably larger sample (226 393 data points) was drawn to assess the scalability of TSVR and to compare it to several commonly used data thinning techniques. For these commonly used techniques, the observations were assigned to cells of h degrees latitude and longitude. For random sampling, a single observation was selected. For the other schemes, all data were used. The accuracy of these data selection methods is shown in Figs. 6a–d (MAE, MSE) and Fig. 7 (correlation). The MAE for the u-component (Fig. 6a) shows that, as the width of the data cells decreases, the discrepancies decrease for both averaging and random selection. The accuracy of Barnes filtering improves as the cells decrease in size and reaches a minimum at a cell width of approximately 0.7 degrees; beyond that, insufficient data density produces increasingly inaccurate results. As the Voronoi cells used by TSVR do not change with the cell width, its accuracy remains constant. For the v-component (Fig. 6b), similar behavior is noted for all techniques. TSVR is the most accurate thinning technique, with MAE ~0.5 m s-1. The MSE values (Figs. 6c, d) are larger than the corresponding MAE values; however, the ranking of the techniques remains the same, with random sampling being least accurate, averaging and Barnes giving similar results, and TSVR producing the most accurate thinning. The correlation between the thinned and non-thinned winds is calculated for the same data selection methods (Fig. 7). As the cell width decreases, the correlations for the u-components given by the three commonly used techniques move closer to the TSVR value, but never exceed it. Despite these large correlations at small cell widths, the larger MAE and MSE of the three commonly used techniques indicate less accurate thinning for those methods. The v-component correlations for the other methods are considerably lower than those for TSVR (Fig. 7). Moreover, the high correlations obtained with the three commonly used data selection methods are achieved at the expense of a loss of computational efficiency (Fig. 8), as the TSVR requires approximately 250 seconds to thin these data at the aforementioned accuracy (correlations of 0.99 and 0.98 for the TSVR) versus over 1000 seconds for the other three techniques. For this experiment, the percentage of data required to obtain this level of accuracy for the TSVR is ~10%. In comparison, the thinning rate of the three commonly used methods, to achieve accuracy close to that of the TSVR, is much larger (~26%).

4.3. Results of the third experiment

Using TSVR, computation times can be decreased by buffering in a series of subsets of data and calculating the support vectors of each sample. This process is known as pipeline thinning (Fig. 2). To investigate the gain in computational efficiency of the pipeline approach, compared to TSVR without a pipeline, a sample of 120 983 data points was drawn from the 1.5 million observations. The results for the regular and pipeline TSVR are very similar, with MAE magnitude differences (Figs. 9a, b) of ≤0.05 m s-1 and MSE differences of ≤0.1 m2 s-2 (Figs. 9c, d). The correlations between the reconstructed and observed winds for the regular versus pipeline methods (Figs. 9e, f) show trivial differences, in the second decimal place at most. It is notable that the correlations for the u-component are, for both the regular and pipeline methods, ~0.97 (Fig. 9e) and, for the v-component, ~0.99 (Fig. 9f), indicating the very close correspondence between the thinned and the non-thinned data. The computation time for the pipeline TSVR is less than that for the regular TSVR. The computational efficiency gain arises because, for the first CPU time step (Fig. 10; t = 1), all the data within the window are thinned; however, for t > 1, using pipeline TSVR, only the new data are thinned. For both the pipeline and non-pipeline TSVR approaches, the time needed to thin the data for the first period was ~145 seconds. However, for periods 2–13, the average thinning time was ~142 seconds for the regular TSVR, decreasing by an order of magnitude to ~13 seconds for the pipeline TSVR approach (Fig. 10). Therefore, the pipeline TSVR approach requires just 9% of the time of the non-pipeline TSVR method, while providing almost identical accuracy.

    5. Conclusions

The removal of redundant data is commonly known as data thinning. In this study, the application is the thinning of u- and v-components of the winds estimated from WindSat. The number of observations is reduced through a combination of Voronoi tessellation and support vector regression (TSVR). Here, hundreds of thousands of observations are assigned to several thousand Voronoi cells to optimize the wind retrieval accuracy. For each cell, separate TSVR analyses were conducted for the u- and v-components of the winds. The number of Voronoi cells can be adapted, consistent with the complexity of the field, by increasing or decreasing their number. The process can be made extremely efficient if it is parallelized by assigning the SVR calculation inside each Voronoi cell to a separate CPU.

The thinning experiments yielded decidedly encouraging results. The TSVR requires fewer than 8%–10% of the WindSat data to produce a highly accurate estimate of the wind field (MAE < 1 m s-1 and correlation ≥ +0.98). In comparison, commonly used techniques, such as random selection, averaging and a Barnes filter, are computationally efficient but have poor retrieval accuracy at coarse spatial resolution. However, at high spatial resolution, as the accuracy of the three commonly used techniques approaches that of TSVR, the computational times for the other thinning methods exceed those of the TSVR approach by a factor of ~4.

High retrieval accuracy is a requirement for meaningful analysis. Of the thinning techniques examined, only TSVR offers the combination of extremely high retrieval accuracy with the shortest clock time. To determine whether the computational efficiency of the TSVR approach could be improved further, a pipeline thinning methodology was applied to the TSVR, reducing the clock time from ~150 to ~15 seconds. Therefore, for any application requiring ingesting and preprocessing online data, followed by thinning, the pipeline TSVR methodology is advantageous. In this study, it is not only the most accurate of all methods tested but is also the fastest, by up to two orders of magnitude.

Acknowledgements. The authors wish to acknowledge NOAA Grant NA17RJ1227 and NSF Grant EIA-0205628 for providing financial support for this work. The third author was partly supported by RSF Grant 14-41-00039. The opinions expressed herein are those of the authors and not necessarily those of NOAA or NSF. The authors also wish to thank Kevin HAGHI, Andrew MERCER and Chad SHAFER for their assistance with several of the figures.

    REFERENCES

Bakır, G. H., L. Bottou, and J. Weston, 2004: Breaking SVM complexity with cross-training. In L. K. Saul, Y. Weiss, and L. Bottou, editors, Advances in Neural Information Processing Systems 17, MIT Press, 81–88.


    Barnes,S.L,1964:A technique for maximizing details in numerical weather-map analysis.Journal of Applied Meteorology,3, 396–409.

    Bondarenko,V.,T.Ochotta,and D.Saupe,2007:The interactionbetween model resolution,observation resolution and observations density in data assimilation:A two-dimensional study.Preprints,11th Symp.On Integrated Observing and Assimilation Systems for the Atmosphere,Oceans,and Land Surface,SanAntonio,TX,Amer.Meteor.Soc.,P5.19.[Available online at http://ams.confex.com/ams/pdfpapers/117655. pdf.]

    Bottou L.,and Y.LeCun,2004:On-line learning for very large datasets.Applied Stochastic Models in Business and Industry, 21,137–151.

    Bowyer,A.,1981:Computing Dirichlet tessellations.Comput.J., 24,162–166.

    Chang,P.,P.Gaiser,K.St.Germain,and L.Li,1997:Multi-Frequency Polarimetric Microwave Ocean Wind Direction Retrievals.Proceedings of the International Geoscience and Remote Sensing Symposium 1997,Singapore.[Available online at http://w.nrl.navy.mil/research/nrl-review/2004/ featured-research/gaiser/#sthash.IskB3x9l.dpuf.]

    Du Q.,V.Faber,and M.Gunzburger,1999:Centroidal Voronoi tessellations:applications and algorithms.SIAM Review,41, 637–676.

Gaiser, P. W., K. M. St. Germain, E. M. Twarog, G. A. Poe, W. Purdy, D. Richardson, W. Grossman, W. L. Jones, D. Spencer, G. Golba, J. Cleveland, L. Choy, R. M. Bevilacqua, and P. S. Chang, 2004: The WindSat spaceborne polarimetric microwave radiometer: Sensor description and early orbit performance. IEEE Trans. on Geosci. and Remote Sensing, 42, 2347–2361.

    Gilbert,R.C.,and T.B.Trafalis,2009:Quadratic programming formulations for classifcation and regression.Optimization Methods and Software,24,175–185.

    Helms,C.N.,and R.E.Hart,2013:A polygon-based line-integral method for calculating vorticity,divergence,and deformation from nonuniform observations.J.Appl.Meteor.Climatol.,52, 1511–1521.

Laskov, P., C. Gehl, S. Krüger, and K.-R. Müller, 2006: Incremental support vector learning: Analysis, implementation and applications. Journal of Machine Learning Research, 7, 1909–1936.

    Lazarus,S.M.,M.E.Splitt,M.D.Lueken,R.Ramachandran,X. Li,S.Movva,S.J.Graves,and B.T.Zavodsky,2010:Evaluation of data reduction algorithms for real-time analysis.Wea. Forecasting,25,511–525.

    Lorenc,A.C.,1981:A three-dimensional multivariate statisticalinterpolation scheme.Mon.Wea.Rev.,109,1177–1194.

    Mansouri,H.,R.C.Gilbert,T.B.Trafalis,L.M.Leslie,and M.B. Richman,2007:Ocean surface wind vector forecasting using support vector regression.In C.H.Dagli,A.L.Buczak,D. L.Enke,M.J.Embrechts,and O.Ersoy,editors,Intelligent Engineering Systems Through Artifcial Neural Networks,17, 333–338.

    MATLAB,2012:MATLABand StatisticsToolbox Release2012b, The MathWorks,Inc.,Natick,Massachusetts,United States. [Available online at http://nf.nci.org.au/facilities/software/ Matlab/techdoc/ref/voronoi.html.]

    Musicant D.R.,and O.L.Mangasarian,2000:Large scale kernel regression via linear programming.Machine Learning,46, 255–269.

Ochotta, T., C. Gebhardt, D. Saupe, and W. Wergen, 2005: Adaptive thinning of atmospheric observations in data assimilation with vector quantization and filtering methods. Quart. J. Royal Meteorol. Soc., 131, 3427–3437.

    Ochotta,T.,C.Gebhardt,V.Bondarenko,D.Saupe,and W.Wergen,2007:On thinning methods for data assimilation of satellite observations.Preprints,23rd Int.Conf.on InteractiveInformation ProcessingSystems(IIPS),SanAntonio,TX, Amer.Meteor.Soc.,2B.3.[Available online at http://ams. confex.com/ams/pdfpapers/118511.pdf.]

    Platt,J,1999:Using sparseness and analytic QP to speed training of support vector machines.In M.S.Kearns,S.A.Solla,and D.A.Cohn,editors,Advances in Neural Information Processing Systems11,MIT Press,557–563.

Purser, R. J., D. F. Parrish, and M. Masutani, 2000: Meteorological observational data compression: An alternative to conventional "super-obbing". NCEP Office Note 430, 12 pp. [Available online at http://w.emc.ncep.noaa.gov/mmb/papers/purser/on430.pdf.]

    Quilfen,Y.,C.Prigent,B.Chapron,A.A.Mouche,and N.Houti, 2007:The potential of QuikSCAT and WindSat observations for the estimation of sea surface wind vector under severe weather conditions,J.Geophys.Res.Oceans,112,49–66.

    Quinn,M.J.2004:Parallel Programming in C with MPI andopenMP.Dubuque,Iowa:McGraw-Hill Professional,544pp.

    Ragothaman,A.,S.C.Boddu,N.Kim,W.Feinstein,M.Brylinski,S.Jha,and J.Kim,2014:Developing ethread pipeline using saga-pilot abstraction for large-scale structural bioinformatics.BioMed Research International,2014.1–12,doi: 10.1155/2014/348725.

    Santosa,B.,M.B.Richman,and T.B.Trafalis,2005:Variable selection and prediction of rainfall from WSR-88D radar using support vector regression.Proceedings of the6th WSEAS Transactions on Systems,4,406–411.

Schölkopf, B., and A. Smola, 2002: Learning with Kernels. MIT Press, 650 pp.

Smola, A. J., and B. Schölkopf, 1998: A Tutorial on Support Vector Regression. Royal Holloway College, NeuroCOLT Technical Report (NC-TR-98-030), University of London, UK. [Available online at http://svms.org/tutorials/SmolaScholkopf1998.pdf.]

    Shawe-Taylor,J.,and N.Cristianini,2004:Kernel Methods for Pattern Analysis.Cambridge University Press,478pp.

Son, H.-J., T. B. Trafalis, and M. B. Richman, 2005: Determination of the optimal batch size in incremental approaches: An application to tornado detection. Proceedings of the International Joint Conference on Neural Networks, IEEE, 2706–2710.

    Trafalis,T.B.,B.Santosa,and M.B.Richman,2005:Feature selection with linear programming support vector machines and applications to tornado prediction,WSEAS Transactions on Computers,4,865–873.

    Vapnik,V.,1982:Estimation of Dependences Based on Empirical Data.Springer,505pp.

Voronoi, G., 1908: Recherches sur les paralléloèdres primitifs. J. Reine Angew. Math., 134, 198–287 (in French).

    Wei,C.-C.,and J.Roan,2012:Retrievals for the rainfall rate over land using special sensor microwave imager data during tropical cyclones:Comparisons of scattering index,regression,and support vector regression.J.Hydrometeor,13, 1567–1578.

    Wilks,D.S.,2011:Statistical Methods in the Atmospheric Sciences.3rd ed.,Elsevier,676 pp.

Citation: Richman, M. B., L. M. Leslie, T. B. Trafalis, and H. Mansouri, 2015: Data selection using support vector regression. Adv. Atmos. Sci., 32(3), 277–286, doi: 10.1007/s00376-014-4072-9.

(Received 17 April 2014; revised 11 September 2014; accepted 18 September 2014)

*Corresponding author: Michael B. RICHMAN

    Email:mrichman@ou.edu

