
    Adaptive Error Curve Learning Ensemble Model for Improving Energy Consumption Forecasting

    2021-12-15 08:12:32  Prince Waqas Khan and Yung-Cheol Byun
    Computers Materials & Continua, 2021, Issue 11

    Prince Waqas Khan and Yung-Cheol Byun

    Department of Computer Engineering, Jeju National University, Jeju-si, Korea

    Abstract: Despite the advancements of the last decades in the field of smart grids, energy consumption forecasting utilizing meteorological features is still challenging. This paper proposes a genetic algorithm-based adaptive error curve learning ensemble (GA-ECLE) model. The proposed technique copes with the stochastic variations of energy consumption forecasting using a machine learning-based ensemble approach. A modified ensemble model based on utilizing the model's error as a feature is used to improve the forecast accuracy. This approach combines three models, namely CatBoost (CB), Gradient Boost (GB), and Multilayer Perceptron (MLP). The ensembled CB-GB-MLP model's inner mechanism consists of generating meta-data from the Gradient Boosting and CatBoost models to compute the final predictions using the Multilayer Perceptron network. A genetic algorithm is used to obtain the optimal features to be used for the model. To prove the proposed model's effectiveness, we have used a four-phase technique using Jeju island's real energy consumption data. In the first phase, we have obtained the results by applying the CB-GB-MLP model. In the second phase, we have utilized a GA-ensembled model with optimal features. The third phase is for the comparison of the energy forecasting result with the proposed ECL-based model. The fourth stage is the final stage, where we have applied the GA-ECLE model. We obtained a mean absolute error of 3.05 and a root mean square error of 5.05. Extensive experimental results are provided, demonstrating the superiority of the proposed GA-ECLE model over traditional ensemble models.

    Keywords: Energy consumption; meteorological features; error curve learning; ensemble model; energy forecasting; gradient boost; catboost; multilayer perceptron; genetic algorithm

    1 Introduction

    Predicting energy consumption remains a problematic and mathematically demanding task for energy grid operators. Current prediction methods are typically based on a statistical analysis of the load and temperature observed in various channels, generating a warning if a critical threshold is reached. However, the latest computer science advances have shown that machine learning can be successfully applied in many scientific research fields, especially those that manipulate large data sets [1-3]. Many researchers have proposed using the residual compensation method to improve accuracy. For example, Su et al. [4] presented an improved ensemble framework for predicting solar power generation. An enhanced ensemble model was proposed to improve prediction accuracy based on a new residual-adaptation evolutionary optimization technique. They applied this solution to solar forecasting using an asymmetric estimator. They conducted extensive case studies on open data sets from multiple solar domains to demonstrate the benefits of adaptive residual compensation ensemble technology. In the article by Christian et al. [5], the authors proposed a new method of predicting thunderstorms using a machine learning model's error as a feature. Their work's basic idea is to use the errors of a two-dimensional optical flow algorithm applied to meteorological satellite images as features for machine learning models. They interpret an error in the optical flow as a sign of convection that can lead to thunderstorms. They also apply various manual feature-engineering steps to account for spatial proximity. They train various tree classification models and neural networks to predict lightning over the next few hours based on these characteristics. They compared the effects of different features on the classification results and the predictive power of the different models. Another work using a machine learning model's error is done by Zhang et al. [6]. The authors proposed a way to train a single hidden layer feed-forward neural network using an extreme learning machine. The proposed method found an effective solution to the regression problem despite the limited modeling ability and the problem's nonlinear nature. Extreme learning machine-based prediction error is unavoidable due to this limited modeling capability. Their article proposes new extreme learning machines for regression problems, such as the residual-compensation extreme learning machine, which uses a multilayered framework in which later layers compensate for the residuals of the baseline layers. The proposed residual-compensation-based extreme learning machine may also serve as a general framework for related problems. However, they did not combine the proposed approach with deep learning schemes.

    We have proposed a modified ensemble model to improve the forecast accuracy based on utilizing the model's error as a feature. This approach combines three models, namely CatBoost (CB), Gradient Boost (GB), and Multilayer Perceptron (MLP). The ensembled CB-GB-MLP model's inner mechanism consists of generating meta-data from the Gradient Boosting and CatBoost models to compute the final predictions using the Multilayer Perceptron network. A genetic algorithm is used to obtain the optimal features to be used for the model. To prove the proposed model's effectiveness, we have used a four-phase technique using actual energy consumption data from South Korea's Jeju province. Jeju island is located on the southernmost side of the Korean peninsula. The solar altitude remains high throughout the year, and in summer, the island enters the zone of influence of tropical air masses. It is situated on the western edge of the Northwest Pacific Ocean, far from the Asian continent, and is affected by the humid ocean [7]. The foremost contributions of this article are to

    · combine three machine learning models, namely CatBoost, Gradient Boost, and Multilayer Perceptron,

    · utilize a genetic algorithm for feature selection, and

    · use the model's error as a feature to improve the forecast accuracy.

    The remainder of the article is arranged as follows. Section 2 introduces preliminaries about the machine learning techniques used in this publication. Section 3 presents the four-stage proposed methodology. Section 4 introduces the data collection, data analysis process, pre-processing, and training. Section 5 presents the performance results of the proposed model evaluated using Jeju energy consumption data and compares the results with existing models. Lastly, we conclude the article in the final section.

    2 Preliminaries

    Artificial neural networks and machine learning provide better results than traditional statistical prediction models in various fields. Li [8] proposed a short-term forecasting model that included extensive data mining and several successive forecasting steps. To reduce the problems caused by noise in the short term, noise reduction methods based on analysis and reconstruction are used. The phase reconstruction method is used to determine the dynamics of testing, training, and neuron configuration of the artificial neural network (ANN). The model also improves the ANN parameters by applying a standard grasshopper optimization algorithm. The simulation results show that the proposed model can predict the load in the short term under various measurement statistics. However, other factors or parameters can be considered in the forecasting model to optimize short-term load prediction. The focus should be on developing data processing technology that can manage short, erratic, and unstable data so users can manage the negative effects of noise. We have proposed to ensemble three machine learning models combined with a genetic algorithm optimization technique. This section introduces these three ML models according to their distinguishing architectures and uses in the literature, namely the Gradient Boosting model, the CatBoost model, and the Multilayer Perceptron. Moreover, the genetic algorithm and ensemble model approaches are also discussed.

    2.1 Gradient Boosting Model

    The Gradient Boosting model is a robust machine learning algorithm developed for various domains such as computer vision [9], chemistry [10], biology [11], and energy [12]. Tree-based ensemble methods are gaining popularity in the field of prediction. In general, combining simple weak regression trees gives high forecasting accuracy. Unlike many other machine learning techniques, which are considered black boxes, tree-based ensemble methods provide interpretable results, require little data pre-processing, and handle many different types of predictors. These features make tree-based ensemble techniques an ideal choice for solving travel time prediction problems. However, the application of tree-based ensemble algorithms in traffic prediction has been limited. In their article, Zhang et al. [13] use the Gradient Boosted Regression Tree method to analyze and model highway driving time to improve prediction accuracy and model interpretability. The gradient boosting method can strategically combine different trees to improve prediction accuracy by correcting the errors generated by the previous base models. They discussed the impact of different parameters on model performance and variable input/output correlations using travel time data provided by INRIX on two sections of the Maryland highway network. They compared the proposed method with other models. The results show that gradient boosting has better performance in predicting road transit times.

    In the study by Touzani et al. [14], a modeling method for building energy consumption based on the gradient boosting algorithm is proposed. A recent testing program on an extensive data set of 410 commercial buildings was used to assess this method's effectiveness. The model training cycle evaluates the out-of-sample performance of different predictive methods. The results show that using gradient boosting improves the R-squared and root mean square error estimates by more than 80% compared to other machine learning models.
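    As a concrete illustration of the gradient-boosting approach discussed above, the following minimal sketch fits scikit-learn's GradientBoostingRegressor on synthetic weather-style data. The features, data, and hyperparameters here are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch: gradient boosting on synthetic hourly-load-style data.
# The two features loosely stand for temperature and humidity (assumed).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-10, 30, size=(500, 2))
y = 100 + 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0, 2, 500)  # noisy linear load

# Sequentially fitted shallow trees, each correcting the previous ensemble's errors
model = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                                  max_depth=3, random_state=0)
model.fit(X[:350], y[:350])
pred = model.predict(X[350:])
```

Each tree is fit to the residuals of the ensemble built so far, which is the "correcting errors generated by the previous base model" behavior described above.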

    2.2 Catboost Model

    CatBoost is an open-source machine learning library based on the gradient boosting algorithm. It can successfully handle categorical features and uses ordered boosting during training rather than post-hoc adjustment [15]. Another advantage of this algorithm is that it reduces overfitting and uses a new scheme to calculate leaf values when determining the tree structure [16]. It is mainly used to solve classification and regression problems efficiently. High-quality models can be obtained without parameter adjustment, and good results can be obtained using default parameters, thereby reducing tuning time. It also supports categorical features without requiring non-numerical properties to be encoded beforehand. In work by Diao et al. [17], CatBoost is used for short-term weather forecasting using the datasets provided by the Beijing Meteorological Administration. A correlation heat map, recursive feature elimination, and a tree model are included in selecting the features. They then recommended wavelet denoising of the data and conducted it before setting up the learning program. The test results show that compared to most deep learning or machine learning methods, the CatBoost model can shorten the training time and improve accuracy.

    2.3 Multi-Layer Perceptron

    The multi-layer perceptron (MLP) consists of simple units connected as neurons or nodes. Each node computes its output signal from the weighted sum of its inputs, transformed by a simple nonlinear activation function [18]. This simple nonlinear transformation is what allows a multi-layer perceptron to represent nonlinear functions. The output is weighted by the connection weights and then sent as input to the nodes in the next network layer. The multi-layer perceptron is considered a feed-forward neural network. In the study by Saha et al. [19], a multi-layer perceptron model was used to predict the location of potential deforestation. Group-level success rates and levels were compared to the MLP. The described model was trained on 70% of the deforestation data; the remaining 30% was used as test data. Distance to settlements, population growth, and distance to roads were the most important factors. After combining the MLP neural network calibration with a hybrid classifier, the accuracy increased. The described method can be used to predict deforestation in other regions with similar climatic conditions.

    The MLP algorithm is trained to learn the weights of the synapses connecting the neurons of each layer [20]. The MLP output can be calculated using Eq. (1):

    y_k = α( Σ_{i=1}^{n} ω_{i,k} · y_i )    (1)

    where α is the sigmoid activation function, ω_{i,k} is the connection weight between the i-th hidden-layer neuron and the k-th output-layer neuron, y_i is the output of the i-th hidden-layer neuron fed to the output layer, y_k is the k-th output of the MLP, and i runs over the hidden-layer neurons from 1 to n.
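    A small numeric check of Eq. (1): one output neuron applies the sigmoid to a weighted sum of hidden-layer activations. The values below are made up for illustration.

```python
# Eq. (1) by hand: y_k = sigmoid( sum_i w_{i,k} * y_i )
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

y_hidden = np.array([0.2, 0.7, 0.5])   # y_i: hidden-layer outputs, i = 1..n
w = np.array([0.4, -0.1, 0.9])         # w_{i,k}: weights into output neuron k

y_k = sigmoid(np.dot(w, y_hidden))     # weighted sum, then sigmoid activation
```

The weighted sum is 0.4·0.2 − 0.1·0.7 + 0.9·0.5 = 0.46, so y_k = sigmoid(0.46) ≈ 0.61.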

    2.4 Genetic Algorithm

    The genetic algorithm (GA) was inspired by Darwin's theory of evolution, in which the survival of fitter organisms and their genes is simulated. GA is a population-based algorithm. Every solution corresponds to a chromosome, and each parameter represents a gene [21]. GA uses an (objective) fitness function to assess the fitness of every individual in the population. To improve poor solutions, individuals are selected randomly with a mechanism that favors the best solutions. The GA algorithm starts with a random population. Populations can be generated from Gaussian random distributions to increase diversity. Inspired by this simple natural selection idea, the GA uses roulette-wheel selection to pick individuals for creating a new generation, with probability proportional to an individual's fitness value. After selection, a crossover operator is applied to create the new generation. The last evolutionary operator is mutation, in which one or more genes are altered in a newly created solution. The mutation rate is set low because a high mutation rate turns GA into a primitive random search [22]. The mutation operator maintains the diversity of the population by introducing a controlled level of randomness.
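    The selection-crossover-mutation loop above can be sketched as a GA for feature selection, the role it plays in this paper. Chromosomes are binary feature masks and fitness is held-out R² of a small regressor; the data, population size, and generation count are illustrative assumptions, not the paper's settings.

```python
# GA sketch for feature selection: roulette-wheel selection, single-point
# crossover, low-rate mutation. Only features 0 and 2 carry signal here.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_samples, n_features = 300, 8
X = rng.normal(size=(n_samples, n_features))
y = 2 * X[:, 0] - 3 * X[:, 2] + rng.normal(0, 0.1, n_samples)

def fitness(mask):
    """Validation R^2 of a linear model restricted to the masked features."""
    if mask.sum() == 0:
        return -np.inf
    Xm = X[:, mask.astype(bool)]
    return LinearRegression().fit(Xm[:200], y[:200]).score(Xm[200:], y[200:])

pop = rng.integers(0, 2, size=(20, n_features))          # random initial masks
for _ in range(15):                                      # generations
    scores = np.array([fitness(ind) for ind in pop])
    p = np.maximum(scores, 0) + 1e-9                     # roulette-wheel weights
    parents = pop[rng.choice(len(pop), size=len(pop), p=p / p.sum())]
    cut = rng.integers(1, n_features)                    # single-point crossover
    children = np.vstack([np.concatenate([parents[i, :cut], parents[i - 1, cut:]])
                          for i in range(len(parents))])
    flip = rng.random(children.shape) < 0.05             # low mutation rate
    pop = np.where(flip, 1 - children, children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
```

With strong selection pressure, the surviving masks concentrate on the informative features, mirroring how the paper prunes 64 candidate features down to 32.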

    2.5 Ensemble Model

    Ensemble refers to the integration of prediction models to perform a single prediction. Ensemble models can combine different weak predictive models to create more robust predictive models [23-25]. The blending ensemble model uses not only the predicted values but also the original features. Fig. 1 shows the graphical representation of the ensemble machine learning approach, where the dataset is used to train different models, and their initial predictions are combined by an ensembler. In the article by Massaoudi et al. [26], the authors propose a framework using the Light Gradient Boosting Machine (LGBM), eXtreme Gradient Boosting machine (XGB), and Multilayer Perceptron (MLP) for calculating effective short-term forecasts. The proposed method uses stacked generalization to identify potential changes in load demand. It is the combination of three models: LGBM, XGB, and MLP. The stacked XGB-LGBM-MLP model's inner mechanism consists of producing metadata from the XGB and LGBM models to compute the final predictions using the MLP network. The performance of the proposed XGB-LGBM-MLP stacked model was tested using two data sets from different sites. The proposed method succeeded in reducing errors; however, its performance degrades as the forecast horizon extends toward 48 h.

    Figure 1:Graphical representation of the ensemble machine learning approach
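    The blending scheme of Fig. 1 can be sketched with scikit-learn alone. Two base regressors are trained on one slice of data, their predictions on a held-out blend set become meta-features (alongside the original features, as blending prescribes), and an MLP meta-learner produces the final prediction. To keep the sketch dependency-light, a random forest stands in for the paper's second base learner (CatBoost); the data and hyperparameters are illustrative.

```python
# Blending sketch: base models -> meta-features -> MLP meta-learner.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(600, 4))
y = 10 * X[:, 0] + 5 * np.sin(6 * X[:, 1]) + rng.normal(0, 0.2, 600)

X_base, y_base = X[:300], y[:300]        # trains the base models
X_blend, y_blend = X[300:450], y[300:450]  # trains the meta-learner
X_test, y_test = X[450:], y[450:]

base_models = [GradientBoostingRegressor(random_state=0),
               RandomForestRegressor(n_estimators=100, random_state=0)]
for m in base_models:
    m.fit(X_base, y_base)

# Meta-data = original features + base-model predictions (blending, not pure stacking)
meta_train = np.column_stack([X_blend] + [m.predict(X_blend) for m in base_models])
meta_test = np.column_stack([X_test] + [m.predict(X_test) for m in base_models])

meta_learner = make_pipeline(StandardScaler(),
                             MLPRegressor(hidden_layer_sizes=(16,),
                                          max_iter=3000, random_state=0))
meta_learner.fit(meta_train, y_blend)
final_pred = meta_learner.predict(meta_test)
```

Fitting the meta-learner on data the base models never saw is what keeps the blend from simply memorizing the base models' training-set fit.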

    In the article by Park et al. [27], the authors proposed a daily hybrid forecasting method based on load characteristic decomposition. Short-term load forecasting can be implemented by adding up sub-load forecasts composed of a combination of different intricate energy consumption patterns. However, due to resource constraints on measurement and analysis, it may not be possible to track all predicted sub-load usage patterns. To remain feasible, the proposed method focuses on general characteristics of the load and an effective decomposition into typical sub-load signals. Using the proposed method, intricate energy consumption patterns can be grouped based on characteristic contours, and the combined load can be decomposed into sub-loads within each group. The ensembled prediction model is applied to each sub-load, and the total load is predicted by summing the expected cluster loads. The hybrid prediction combines characteristic-load-decomposition-based linear prediction with LSTM, which is superior to traditional methods. A hybrid method combining conditional linear prediction by day type with long short-term memory regression produces a single prediction from the decomposed sub-loads. Complex campus load data was used to evaluate the proposed hybrid forecast with load decomposition. The evaluation found that the proposed scheme outperformed comparable history-based forecasting methods. The sub-loads can be measured only over a limited time period, but the decomposition technique can be applied to the extended training data through virtual decomposition.

    3 Proposed Methodology

    We have designed a four-phase strategy to better understand the impact of the error curve learning technique. Fig. 2 shows an overview of these four phases. In the first phase, we have obtained the results by applying the CB-GB-MLP model. This approach combines three models, namely CatBoost (CB), Gradient Boost (GB), and Multilayer Perceptron (MLP). The ensembled CB-GB-MLP model's inner mechanism consists of generating meta-data from the Gradient Boosting and CatBoost models to compute the final predictions using the Multilayer Perceptron network. In this phase, 70% of the original dataset is used for training purposes and 30% for testing. In the second phase, we have utilized a GA-ensembled model with optimal features. The input features for energy consumption forecasting consist of weather, time, and holidays.

    Figure 2:Design of the four-phase strategy

    A genetic algorithm is used to obtain optimal features. The CB-GB-MLP model is then used for forecasting, using the same data division scheme as in phase 1. By applying the GA-ensembled model, we obtained better performance results. Using Eq. (2), the error curve is obtained at this stage:

    EC_t = (y_t − ŷ_t) / y_t × 100    (2)

    where y_t is the actual value of energy consumption and ŷ_t is the predicted value.

    The third phase is for the comparison of the energy forecasting results with the proposed ECL-based model. In this stage, we have used the testing dataset of phase 2 as the complete dataset. Then we split this data into 70% and 30% for training and testing, respectively.

    The fourth stage is the final stage, where we have applied the genetic algorithm-based error curve learning ensemble (GA-ECLE) model. For this phase, we have again used the testing dataset of phase 2 as the complete dataset. Then we split this data again into 70% and 30% for training and testing, respectively. However, this time we have used the error data obtained in phase 2 as an input feature along with the weather, time, and holiday features. By utilizing the GA-ECLE model, we obtained comparatively good results. Fig. 3 explains the flow diagram of the proposed GA-ECLE model. It starts with acquiring the input data from different sources, then performing feature engineering, such as obtaining date features, and then pre-processing the data. Pre-processing involves filling the missing values and converting the textual data into numeric data for better training. The pre-processed data is then split into training and testing datasets. The genetic algorithm is used to obtain optimal features; these optimal features are then passed to an ensembled model, where the test data is used to generate the error curve. The ensemble model is trained again using the error curve, finally yielding the ensembled model's optimal predictions.

    Time series data is used to evaluate the performance of the proposed model. Eq. (3) is used to estimate the predicted value of the next timestamp. The data we have used is hourly based, so here t represents one hour of time:

    y(t+1) = f( y(t), F_t )    (3)

    where y(t+1) is the forecasted value at the next time step, y(t) represents the hourly load consumption at time t, and F_t represents the features used to estimate the future load value. As expressed in Eq. (4), there are four different classes of features:

    F_t = { WF_t^s, DF_t, HF_t, EF_t }    (4)

    where WF_t^s, DF_t, HF_t, and EF_t represent the weather, date, holiday, and error features, respectively.

    There are four different weather stations on Jeju island, named Jeju-si, Gosan, Sungsan, and Seogwipo. The set of weather features from each weather station, WF_t^s, is represented in Eq. (5). Meteorological features consist of average temperature TA_t, dew-point temperature TD_t, humidity HM_t, wind speed WS_t, wind direction degree WD_t, atmospheric pressure on the ground PA_t, discomfort index DI_t, sensible temperature ST_t, and solar irradiation quantity SI_t:

    WF_t^s = { TA_t, TD_t, HM_t, WS_t, WD_t, PA_t, DI_t, ST_t, SI_t }    (5)

    Each weather station is assigned a different weight factor according to its importance, as described in Eq. (6), where α = 0.5, β = γ = 0.1, and δ = 0.3:

    WF_t^s = α·WF_t^j + β·WF_t^g + γ·WF_t^ss + δ·WF_t^sp    (6)

    where WF_t^j, WF_t^g, WF_t^ss, and WF_t^sp represent the sets of weather features from the Jeju-si, Gosan, Sungsan, and Seogwipo stations, respectively.
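    Numerically, the station weighting of Eq. (6) is just a weighted average with α = 0.5, β = γ = 0.1, and δ = 0.3. The temperature readings below are made-up values for illustration.

```python
# Eq. (6) on one feature (average temperature, deg C) from the four stations.
alpha, beta, gamma, delta = 0.5, 0.1, 0.1, 0.3   # Jeju-si, Gosan, Sungsan, Seogwipo

wf_jeju, wf_gosan, wf_sungsan, wf_seogwipo = 16.0, 14.5, 15.2, 17.1  # illustrative

wf_t = (alpha * wf_jeju + beta * wf_gosan
        + gamma * wf_sungsan + delta * wf_seogwipo)  # weighted station blend
```

Here wf_t = 0.5·16.0 + 0.1·14.5 + 0.1·15.2 + 0.3·17.1 = 16.10, dominated by the highest-weighted Jeju-si station.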

    Figure 3:Flow diagram of the proposed GA-ECLE framework

    Date features DF_t can be extracted from the time-series data, which helps in understanding the relation of the target value with respect to time [28]. The set of date features, as described in Eq. (7), consists of hour H_t, month M_t, year Y_t, quarter Q_t, and day of week DoW_t:

    DF_t = { H_t, M_t, Y_t, Q_t, DoW_t }    (7)

    Holidays also greatly impact energy consumption; we have collected the holidays and built a set of holiday features HF_t, as expressed in Eq. (8):

    HF_t = { HD_t, SDC_t, SDN_t }    (8)

    It contains the holiday code HD_t, special day code SDC_t, and special day name SDN_t. The holiday code covers holiday types such as solar holidays, lunar holidays, election days, holidays interspersed with workdays, alternative holidays, changes in demand, and special days. Special days consist of several holidays, including New Year's Day, Korean army day, Korean New Year's Day, Christmas, workers' day, children's day, constitution day, and liberation day.

    We have proposed using the model's error EF_t as a feature. It contains two features, as explained in Eq. (9). EC_t is calculated using Eq. (2), and the absolute error curve AEC_t using Eq. (10):

    EF_t = { EC_t, AEC_t }    (9)

    AEC_t = |EC_t|    (10)
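    Building the two error features of Eq. (9) from a model's actuals and predictions is a one-liner each in pandas: the percentage error curve of Eq. (2) and its absolute value per Eq. (10). The three rows below are illustrative values, not the paper's data.

```python
# Error features: EC_t = (y_t - y_hat_t) / y_t * 100, AEC_t = |EC_t|.
import pandas as pd

df = pd.DataFrame({
    "actual":    [520.0, 610.0, 480.0],   # y_t (illustrative loads, MW)
    "predicted": [505.0, 622.0, 470.0],   # y_hat_t from the phase-2 model
})
df["EC"] = (df["actual"] - df["predicted"]) / df["actual"] * 100  # Eq. (2)
df["AEC"] = df["EC"].abs()                                        # Eq. (10)
```

EC keeps the sign (over- vs. under-forecast), while AEC gives the magnitude; both columns are then appended to the weather, date, and holiday features for retraining.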

    The target feature is the total hourly load consumption. Jeju island fulfills its energy needs from different sources, including renewable and non-renewable sources such as wind and solar energy. We have combined all the load-consumption energy sources using Eq. (11):

    y_t = L_t^FF + L_t^BTM + L_t^PV + L_t^WP    (11)

    where L_t^FF shows the energy load from fossil fuel-based sources, L_t^BTM is for behind-the-meter sources, L_t^PV represents load from photovoltaic sources, and L_t^WP represents load consumption from wind energy sources.

    The proposed hybrid model consists of three sub-models: CatBoost, Gradient Boost, and multi-layer perceptron. The output of the proposed hybrid model, ŷ_t, is calculated using Eq. (12):

    ŷ_t = C_w^CB · ŷ_t^CB + C_w^GB · ŷ_t^GB + C_w^MLP · ŷ_t^MLP    (12)

    where ŷ_t^CB, ŷ_t^GB, and ŷ_t^MLP represent the outputs of the CatBoost, Gradient Boost, and multi-layer perceptron models, respectively. C_w represents the weight coefficient of each model; it is calculated using Eq. (13) from the actual values y_t and predicted outputs ŷ_t over the M values of time t.

    4 Forecasting Using GA-ECLE Model

    This section explains the complete data acquisition, data analysis, feature engineering, and proposed model training. Fig. 4 shows the block diagram of the proposed forecasting framework in the context of Jeju island's weather and energy consumption patterns. We have used the actual energy consumption data of Jeju island, South Korea. This island has four weather stations, named Jeju-si, Gosan, Sungsan, and Seogwipo. Input variables for this model are of four different types. First are the meteorological features from the four weather stations. The second is holidays, the third is time features, and the fourth is energy sources, which comprise fossil fuel-based (FF), photovoltaic (PV), behind-the-meter (BTM), and wind power (WP) energy sources.

    Figure 4:Block diagram of the proposed forecasting framework

    Pre-processing this data involves different functions such as data cleaning, one-hot encoding, imputation, and feature construction. We have also assigned different weights to the weather features according to the impact of each weather station. This pre-processed data is used as input for the genetic algorithm, which helps obtain optimal features according to their importance in prediction. The initial number of features was 64, and it was reduced to 32 after applying the genetic algorithm. We provided the error and absolute error as features along with the holiday, meteorological, and date features. These features served as input to the ECLE model. This ensembled model consists of three models, namely CatBoost (CB), Gradient Boost (GB), and Multilayer Perceptron (MLP). The ensembled CB-GB-MLP model generates meta-data from the Gradient Boosting and CatBoost models and computes the final predictions using the Multilayer Perceptron. We have used different evaluation metrics such as root mean square error (RMSE) and mean absolute percentage error (MAPE) to evaluate our proposed model.

    The pseudo-code for the genetic algorithm-based error curve learning model is expressed stepwise in Algorithm 1. It initializes by importing the actual data files and libraries such as NumPy, pandas, and matplotlib. Then the data is pre-processed using imputation, converting textual data into numeric data using one-hot encoding, and assigning weights α, β, γ, and δ to the meteorological features according to the weather stations. A genetic algorithm is used to obtain optimal features. The next step is to build and train an ensembled hybrid model, model_hybrid. This model contains CatBoostRegressor() and GradientBoostingRegressor() to obtain metadata and MLPRegressor() to compute the final predictions. The pre-processed data is divided into two parts, one for training and the other for testing purposes. After training this model, the testing data is used to generate an error curve. The error features are then used to retrain the model along with the meteorological, holiday, and date features. The final step is the evaluation and getting predictions using the trained model.

    Algorithm 1: Pseudo-code for Adaptive Error Curve Learning Ensemble Model
    Step 1: Preliminaries
    1: import libraries
    2: data = import CSV data file
    3: Step 2: Preprocessing
    4: fill null values
    5: one_hot_encoding = encoding.fit(dff["SDC_t"])
    6: WF_t^s = α·WF_t^j + β·WF_t^g + γ·WF_t^ss + δ·WF_t^sp
    7: apply genetic algorithm to get optimal features
    8: Step 3: Build and train model
    9: model_hybrid = BlendEnsemble()
    10: model_hybrid.add_meta(CatBoostRegressor())
    11: model_hybrid.add_meta(GradientBoostingRegressor())
    12: model_hybrid.add(MLPRegressor())
    13: train = data.loc[data.index <= date_split].copy()
    14: test = data.loc[data.index > date_split].copy()
    15: model_hybrid.fit(train_X, train_y)
    16: Step 4: Get error curve
    17: EC_t = (y_t − ŷ_t) / y_t × 100
    18: AEC_t = |EC_t|
    19: Step 5: Retrain model using error curve
    20: use error features and repeat Step 3
    21: Step 6: Evaluation and prediction
    22: evaluate the model using RMSE, MAE
    23: prediction = model_hybrid.predict(test_X)

    4.1 Exploratory Data Analysis

    This section provides the exploratory data analysis to understand the patterns of the data better. The data provided by the Jeju energy corporation consists of different features. These features include the energy consumption of different energy sources, weather information from four weather stations, and holiday information. The total dataset consists of hourly-based energy consumption from 2012 to mid-2019. Fig. 5 shows the monthly energy consumption in the complete dataset.

    Figure 5:Monthly average load consumption for each year

    Each line shows a different year from 2012 to 2019. The X-axis represents the month, and the Y-axis represents the average energy consumption in MW. Average energy consumption is low in the month of November and high during August and January.

    Fig. 6 shows the average temperature recorded at the different Jeju island weather stations after applying the weight factors. Each line shows a different weather station's data. The X-axis represents the time frame, and the Y-axis represents the average temperature in Celsius.

    Figure 6:Average temperature after applying weight for each weather station

    Fig. 7 shows the energy consumption with respect to the different seasons. Mean energy consumption is highest during the winter season and lowest during the spring season.

    Fig. 8 shows the mean energy consumption during each year. A gradual increase in consumption can be observed from this chart.

    Figure 7:Season wise average load consumption

    Figure 8:Year-wise average load consumption

    Tab. 1 summarizes the variables used in the dataset. It contains the variable names, their descriptions, and measuring units. The day of the week is represented from 0 to 6, starting from Sunday as 0. This table shows that the input data consists of Code of Day of the Week, Holiday Code, Special Day Code, Special Day Name, Total Load, Temperature, Temperature of Dew Point, Humidity, Wind Speed, Wind Direction Degree, Atmospheric Pressure on the Ground, Discomfort Index, Sensible Temperature, and Solar Irradiation Quantity. This data represents the problem adequately because it contains various weather and day features that have a great impact on energy consumption.

    Table 1:Measuring units of features

    Weekdays and holidays are assigned binary numbers, where a weekday is represented as 0 and a holiday as 1. Total load represents the hourly energy consumption in MW. Temperature, dew-point temperature, and sensible temperature are measured in Celsius. Other features used are humidity in percentage, wind speed in m/s, wind direction degree in degrees, atmospheric pressure on the ground in hPa, and solar irradiation quantity in MJ/m2.

    4.2 Training and Testing

    The total dataset consists of hourly-based energy consumption from 2012 to 2019. It contains 64,999 rows. Tab. 2 summarizes the training and testing data for each phase.

    Table 2:Training and testing data for each phase

    Fig. 9 shows the graphical representation of the testing and training data split for phase 1 and phase 2. The green color represents the training data, and the blue color shows the testing data. For phase 1 and phase 2, we have used 62 months as the training duration, which contains 45,476 rows. The last 26 months, comprising 19,523 rows, are used as testing data and to obtain the error curve.

    Figure 9:Training and testing data for Phase 1 and Phase 2

    The graphical representation of the testing and training data split for phase 3 and phase 4 is shown in Fig. 10. For phase 3 and phase 4, we have used 18 months and 20 days of training data, consisting of 13,680 rows from the 10th of March 2017 to the 29th of September 2018. We have used the remaining months as testing data, consisting of 5,763 rows from the 30th of September 2018 to the 31st of May 2019.

    Figure 10:Training and testing data for Phase 3 and Phase 4
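    The date-based splits used across the phases can be sketched with a pandas DatetimeIndex: everything up to a cutoff timestamp trains the model, everything after it tests it. The date range, cutoff, and `load` column are illustrative, not the exact phase boundaries.

```python
# Chronological train/test split on an hourly index (illustrative dates).
import pandas as pd

idx = pd.date_range("2017-03-10", "2019-05-31 23:00", freq="h")
data = pd.DataFrame({"load": range(len(idx))}, index=idx)

date_split = pd.Timestamp("2018-09-29 23:00")
train = data.loc[data.index <= date_split].copy()  # rows up to the cutoff
test = data.loc[data.index > date_split].copy()    # strictly later rows
```

Splitting on time rather than at random keeps the test set strictly in the future of the training set, which is essential for honest load-forecast evaluation.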

    5 Experimental Results and Evaluation

    The accuracy of machine learning techniques must be verified before implementation in a real-world scenario [29]. In this study, two methods are used to assess accuracy and reliability. One is graphic visualization, and the other is error metrics. The error metrics used in the model evaluation include mean absolute error (MAE), mean absolute percentage error (MAPE), root mean square error (RMSE), and root mean squared logarithmic error (RMSLE).

    Fig. 11 shows the visual representation of feature importance, ordering the features by the importance scores obtained after applying the genetic algorithm. The X-axis represents the feature importance score obtained using GA, and the Y-axis lists the feature names. Because the data is collected on an hourly basis, the hour feature is of great importance for prediction, followed by the year feature and then several weather features. We removed the least important features to improve the results: of the original 64 features, the top 32 were selected after applying the genetic algorithm.
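    The GA searches over binary masks (one bit per feature) and keeps the masks whose selected feature subset scores best. The sketch below shows the general mechanism with elitism, single-point crossover, and bit-flip mutation; the hyper-parameters and the toy fitness function are illustrative assumptions, not the paper's settings:

```python
import random

def ga_feature_selection(n_features, fitness, pop_size=20, generations=30,
                         crossover_rate=0.8, mutation_rate=0.02, seed=0):
    """Minimal genetic algorithm over binary feature masks.

    `fitness` maps a 0/1 mask (list of ints) to a score to maximise,
    e.g. the negative validation error of a model trained on the
    selected features.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                       # elitism: keep the two best masks
        while len(next_pop) < pop_size:
            # Pick parents from the better half of the population.
            p1, p2 = rng.sample(scored[:pop_size // 2], 2)
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_features)  # single-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation.
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: features 0-4 are "informative", extra features are penalised.
def toy_fitness(mask):
    return sum(mask[:5]) - 0.3 * sum(mask[5:])

best = ga_feature_selection(16, toy_fitness)
```

In practice the fitness evaluation would train the ensemble on the masked feature set and return its validation score, so the GA converges toward subsets like the 32-of-64 selection reported above.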

    Fig. 12 shows the error curve obtained during Phase 2. This error curve is used as an input feature, along with the other weather and date features, during Phase 4. The graph shows the variation in error on an hourly basis: the X-axis shows the month, and the Y-axis shows the percentage error.
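    The error-curve feature can be derived directly from an earlier forecasting pass: the percentage error between actual and predicted load becomes an extra column for the final model. A minimal sketch of that derivation, assuming pandas Series with a datetime index (the function name and the hour-of-day aggregation are illustrative):

```python
import pandas as pd

def error_curve_feature(actual, predicted):
    """Hourly percentage error between actual and predicted load.

    Returns the raw error curve plus a 24-value hour-of-day profile
    that can be joined back onto the feature matrix.
    """
    pct_error = 100.0 * (actual - predicted).abs() / actual
    profile = pct_error.groupby(pct_error.index.hour).mean()
    return pct_error, profile

# Toy two-day hourly series standing in for the Phase 2 data.
idx = pd.date_range("2018-10-01", periods=48, freq="H")
actual = pd.Series([500 + 10 * (t.hour % 24) for t in idx], index=idx)
predicted = actual * 1.02           # a forecast that is uniformly 2% high
curve, profile = error_curve_feature(actual, predicted)
```

Merging `curve` (or the hourly `profile`) into the Phase 4 feature matrix lets the final model learn where the earlier model systematically over- or under-predicts.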

    Figure 11:Feature importance graph after applying genetic algorithm

    The mean absolute error expresses the difference between predicted and actual values. It is calculated using Eq. (14), where y_a represents the actual value and y_p the predicted value: all absolute errors are summed and divided by the total number of samples, expressed in the equation as N. The proposed model's MAE is 3.05. The mean squared error, calculated using Eq. (15), represents the squared difference between the actual and predicted values [30]: all errors are squared, summed, and divided by the total number of samples. The MSE of the proposed model is 115.09. The RMSE, obtained by Eq. (16), is the square root of the MSE, where y_p is the forecasted value obtained by the proposed model, y_a is the actual load value, and N represents the sample size [31]. The RMSE of the proposed model is 5.05.

    Figure 12:Error curve generated from phase 2

    The root mean squared logarithmic error (RMSLE) is obtained by Eq. (17); it expresses the logarithmic relationship between the actual value and the model's predicted value. The RMSLE of the proposed model is 0.008.
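    The four metrics can be written directly from their definitions. A minimal sketch (the equation numbers in the comments refer to the paper's Eqs. (14)-(17); the sample values are illustrative):

```python
import math

def mae(actual, predicted):
    """Mean absolute error, Eq. (14): mean of |y_a - y_p|."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mse(actual, predicted):
    """Mean squared error, Eq. (15): mean of (y_a - y_p)^2."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean square error, Eq. (16): square root of the MSE."""
    return math.sqrt(mse(actual, predicted))

def rmsle(actual, predicted):
    """Root mean squared logarithmic error, Eq. (17),
    using log(1 + y) so zero values are handled safely."""
    return math.sqrt(
        sum((math.log1p(a) - math.log1p(p)) ** 2
            for a, p in zip(actual, predicted)) / len(actual)
    )

actual = [500.0, 520.0, 480.0]     # toy load values in MW
predicted = [505.0, 515.0, 490.0]
```

RMSLE penalizes relative rather than absolute deviations, which is why its value (0.008) is so much smaller than the RMSE (5.05) on the same predictions.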

    Tab. 3 summarizes the error during all phases of the experiments, covering the minimum, maximum, and mean error of the testing data for each phase. From the table, it is evident that there is a significant improvement in the results when ECL is applied: we obtained a maximum error of 10.80%, and the recorded mean error was 1.22%. We also compared the proposed GA-ECLE model with state-of-the-art models, including GradientBoost, CatBoost, Multilayer Perceptron, LSTM, XGBoost, and Support Vector Regressor. Tab. 4 shows the comparison across different evaluation metrics.

    Table 3:Comparison of error

    Table 4:Model accuracy evaluation metrics

    The prediction results on the testing data are also illustrated graphically. In Phase 4, data from the 30th of September 2018 to the 31st of May 2019 is used as testing data. Fig. 13 presents a graphical comparison of the results with and without applying ECL.

    Figure 13:Comparison of Actual test data and prediction with and without ECL

    Green lines show the actual load values, orange lines show the prediction after applying ECL, and cyan lines represent the prediction using the same model without ECL.

    To visualize the results more clearly, we selected two time frames: 48 h and one week (168 h). Fig. 14 shows the results for the 48 h from the 20th of May 2019 to the 22nd of May 2019. Fig. 15 shows the actual values, predictions, and their difference for one week, from the 15th of February 2019 to the 21st of February 2019. In both figures, the green line shows the actual load consumption, and the orange line represents the prediction with ECL, while the difference between prediction and actual consumption is shown in purple.

    Figure 14:Comparison of 48 h ahead prediction with and without ECL

    Figure 15:Comparison of the week ahead prediction with and without ECL

    6 Conclusions

    This research presents a novel genetic algorithm-based adaptive error curve learning ensemble model. The proposed technique copes with the stochastic variations of energy consumption using a machine learning-based ensemble approach. The modified ensemble model uses the model's error as a feature to improve prediction accuracy. This approach combines three models: CatBoost, Gradient Boost, and Multilayer Perceptron. A genetic algorithm is used to obtain the optimal features for the model. To prove the proposed model's effectiveness, we used a four-phase technique on Jeju Island's actual energy consumption data. In the first phase, the CB-GB-MLP model was applied and the results were obtained. In the second phase, we used a GA-ensembled model with the optimal features. In the third phase, we compared the energy prediction results with the proposed ECL-based model. The fourth phase is the final step of applying the GA-ECLE model. Extensive experimental results have been presented to show that the proposed GA-ECLE model is superior to existing machine learning models such as GradientBoost, CatBoost, Multilayer Perceptron, LSTM, XGBoost, and Support Vector Regressor. The results of our approach are very promising: we obtained a mean error of 1.22%, compared with 5.75% without the proposed approach. We have presented the results graphically along with a statistical comparison. The empirical results are favorable for the applicability of the proposed model in industry. In the future, other features can be added, such as the impact of the population using electricity and the number of electric vehicles.

    Funding Statement: This research was financially supported by the Ministry of Small and Medium-sized Enterprises (SMEs) and Startups (MSS), Korea, under the "Regional Specialized Industry Development Program (R&D, S2855401)" supervised by the Korea Institute for Advancement of Technology (KIAT).

    Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
