
    Hybrid Sensor Selection Technique for Lifetime Extension of Wireless Sensor Networks

    2022-03-14
    Computers, Materials & Continua, 2022, Issue 3

    Khaled M. Fouad, Basma M. Hassan and Omar M. Salim

    1 Information Systems Department, Faculty of Computers and Artificial Intelligence, Benha University, Benha 13518, Egypt

    2 Faculty of Artificial Intelligence, Kafrelsheikh University, Kafrelsheikh 33516, Egypt

    3 Electrical Engineering Department, Benha Faculty of Engineering, Benha University, Benha 13518, Egypt

    Abstract: Energy conservation is a crucial issue for extending the lifetime of wireless sensor networks (WSNs), where battery capacity and energy sources are very restricted. Intelligent energy-saving techniques can help designers overcome this issue by reducing the number of sensors selected to report environmental measurements, eliminating replicated and unrelated features. This paper suggests a Hybrid Sensor Selection (HSS) technique that combines filter and wrapper methods to acquire an information-rich subset of sensors in a reasonable time. HSS aims to increase the lifetime of WSNs by using the optimal number of sensors. At the same time, HSS maintains the desired level of accuracy and manages sensor failures with the most suitable number of sensors without compromising that accuracy. The HSS technique was evaluated in four experiments using four different datasets. These experiments show that HSS can extend the WSN lifetime and increase accuracy using a sufficient number of sensors without affecting network functionality. Furthermore, to ensure its credibility and reliability, the proposed HSS technique has been compared to corresponding methodologies and shows its superiority in energy conservation at premium accuracy measures.

    Keywords: Energy conservation; WSNs; intelligent techniques; sensor selection

    1 Introduction

    Wireless Sensor Networks (WSNs) can be described as groups of connected sensor nodes [1], which are small-sized devices with restricted resources such as power supply and memory. In a traditional sensor network, every node monitors physical environmental conditions such as sound, temperature, pressure, humidity, motion, light, and vibration. In a sensor network [2], sensors can cooperate to carry out a particular purpose because of the nature of the observed parameters and the enormous quantity of deployed sensors. The information generated by the sensor network is highly correlated, and reporting every individual sensor reading is a waste of energy resources. Besides, several motives highlight the significance of adopting machine learning methodologies in WSN applications to learn about and discover correlated data, predictions, decisions, and data classification. The first reason is that sensor nodes may not operate as predicted due to sudden environmental behavior. The second is the unpredictable environments in which WSNs are deployed.

    The third reason is that sensor nodes produce massive quantities of correlated and replicated data. An energy-efficient conservation scheme was illustrated in [3] to prolong the network lifetime. A lot of energy in sensor nodes is wasted in data communication; hence, decreasing the quantity of unnecessary communication helps minimize energy waste and extends the overall network lifespan. Furthermore, designing a high-quality power management scheme is one of the principal challenges of WSNs. Many research efforts have applied intelligent techniques based on machine learning. SVM has been efficiently utilized for various scientific purposes, specifically with a linear kernel, which can learn the required information from fewer training samples while still offering high classification accuracy. Classification algorithms such as K-Nearest Neighbor (K-NN) [4], Support Vector Machine (SVM) [5], and Random Forest (RF) [6] are adopted extensively and investigated in machine learning. SVM is also regarded as a highly effective algorithm in the field of data mining [7].

    The problem statement stems from the fact that energy in WSNs is a rare commodity, particularly in cases where it would be hard to supply additional energy if the available energy were used up. Even in scenarios where energy harvesting is possible, efficient energy usage remains a crucial objective to extend the life of the network. Therefore, the main goal in designing WSNs is to keep the energy consumption of sensors as low as possible, since the main issue is the limited battery capacity of sensors. The major objective of the methods above is to raise network performance by eliminating noisy, redundant, and irrelevant features. In this paper, the number of features is reduced to limit the energy consumption of sensors, thereby extending the lifetime of the network while maintaining a certain level of classification accuracy.

    The paper assumes that sensors in a wireless network are activated only on demand from the base station (sink node) of the underlying system (platform). Moreover, each sensor measures one feature of the environment and returns its value to the base station, where the proposed HSS is executed. Hence, classifier training and validation are also executed on the base station, because the ML techniques would increase energy consumption if they were implemented separately in each node of the network.

    The paper contributions are summarized below:

    • Provides a detailed review of earlier energy-efficiency methods for WSNs based on intelligent ML models and algorithms.

    • Suggests a Hybrid Sensor Selection (HSS) technique that minimizes the number of sensors to reduce the sensors' energy consumption and increase the network's lifetime while maintaining a proven level of classification accuracy.

    • The HSS technique combines filter and wrapper methods to acquire an informative subset of sensors in a reasonable time.

    • Four dissimilar and extensive experiments were conducted to demonstrate the effectiveness of the proposed technique in improving the WSN lifespan.

    • The proposed HSS technique is compared with existing approaches on measurement tasks and datasets, and its superiority is highlighted.

    The rest of this paper is organized as follows. Section 2 reviews the state of the art in sensor selection and energy-efficient schemes based on intelligent models for WSNs. The main contribution of this paper is presented in Section 3 through the proposed HSS technique. Section 4 discusses the simulation experiments and setup. Simulation results are summarized in Section 5. Finally, Section 6 concludes the paper and highlights future research directions.

    2 Related Work

    This section provides a literature review of recent approaches proposed for energy conservation in WSNs based on machine learning and intelligent energy-saving models; energy conservation through energy management has a long history.

    Discriminative extracted features have been used to enhance the computational efficiency of models based on Naïve Bayes, multilayer perceptron (MLP), and SVM. Intelligent energy management models have been introduced to maximize WSN lifetime and decrease the number of sensors chosen for environmental measurements, achieving a highly energy-efficient network while keeping the preferred degree of accuracy in the reported readings. This selection approach ranked sensors by the importance of their usage, from the most to the least important, using three intelligent models based on Naïve Bayes, MLP, and SVM classifiers. MATLAB simulation outcomes showed that sensors chosen by linear SVM produced higher energy efficiency than those chosen through Naïve Bayes and MLP for an equal Lifetime Extension Factor (LEF) of the WSN [8].

    Other associated research offered a massive range of selection criteria to handle feature importance, such as the Laplacian score and Fisher score [9]. These techniques addressed several combinatorial optimization bottlenecks, such as local optimality and costly computation. A hybrid methodology was proposed [10] based on the wavelet transform (WT) and mutual information change (MIC) for faulty sensor identification. The results showed that the WT and MIC approaches obtained considerable accuracy for most fault varieties. Another hybrid model was proposed [11] using Rough Set Theory (RST), the Binary Grey Wolf Optimization Algorithm, and Chaos Theory for feature selection problems. Results indicated that the proposed method showed higher performance, greater speed, lower error, and shorter execution time.

    A partly informed sparse autoencoder (PISAE)-based energy conservation approach was introduced in [12] for prolonging WSN lifetime. PISAE aimed to improve lifespan by putting sensors with redundant readings to sleep without dropping important information. PISAE performance was tested on three specific datasets using MLP, Naïve Bayes, and SVM classifiers. A decision scheme was proposed [13] to minimize the energy consumption of WSNs. The sensors were sorted top-down from the most to the least important based on their usage in the WSN; the Naïve Bayes method was used for this purpose. The stated method was tested on several popular real datasets. The results showed that extra energy is consumed if the best sensors are overused, and the sensor network lifetime is then minimized. The linear SVM [14] performed better than most of the machine learning classifiers presented, and more interest has been dedicated to this classifier for its outstanding classification accuracy. Therefore, this intelligent classifier has been utilized in a variety of applications. An energy management scheme for WSNs was proposed [15] to reduce the number of sensors that report measurements. The results indicated that the multilayer perceptron (MLP) algorithm achieved a significant accuracy enhancement compared to Naïve Bayes for the same lifetime extension factor.

    In [16], an energy-saving framework for WSNs was proposed using machine learning techniques and meta-heuristics, with two-phase energy savings on the sensor nodes. In the first phase, network-level energy saving was achieved by finding the minimum number of sensor nodes needed. To find it, the authors applied hybrid filter-wrapper feature selection to find the best feature subsets and found that the measures did not produce significant performance differences beyond 20%. In the second phase, they achieved node-level energy savings by manipulating the sampling rate and the transmission interval of the sensor nodes. They proposed an optimization method based on Simulated Annealing (SA), and this approach reduced energy consumption by more than 90%. However, the work has some limitations: it lacks the scalability to handle changes in the WSN, and because of bad routing protocols and complex topology, the approach may be infeasible in the real world.

    3 The Proposed Technique

    An intelligent technique for energy-efficient conservation in WSNs is introduced using classification methodologies. In this paper, the Sensor Discrimination, Ionosphere, ISOLET, and Forest Cover Type datasets are used to assess the superiority of the intelligent algorithms in terms of classification accuracy and the Lifetime Extension Factor (LEF) shown in Eq. (1).

    The lifetime extension factor is the ratio between the total number of network sensors and the number of sensors used by HSS for a specific experiment:

    LEF = (total number of network sensors) / (number of sensors selected by HSS)    (1)

    Understandably, the minimum LEF value is 1, which reflects the case where the network administrator always uses all sensors in each classification step, which should reveal the highest possible accuracy. Conversely, if fewer sensors are selected for that specific network, the LEF value will be greater than 1, and as a result the network lifespan will be extended. In addition, one of the research assumptions is that each sensor can be used properly M times before it becomes unavailable, which reflects the sensor lifetime based on the energy it consumes in measurement and network connectivity. The key point here is to avoid spending sensor and network energy on irrelevant or redundant sensors. This means LEF is inversely proportional to network energy consumption: the higher the LEF value, the lower the network energy consumption. If HSS can increase the LEF, energy conservation is achieved while maintaining network functionality and accuracy within the accepted constraints.
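    The LEF definition above can be computed directly. A minimal Python sketch (the helper name is ours; the sensor counts are the ones reported later for experiment four, where 3 of 54 sensors are selected):

```python
def lifetime_extension_factor(total_sensors: int, selected_sensors: int) -> float:
    """LEF = total number of network sensors / sensors actually used (Eq. 1)."""
    if selected_sensors < 1:
        raise ValueError("at least one sensor must be selected")
    return total_sensors / selected_sensors

# Experiment four, case 1: 3 of 54 sensors selected.
lef = lifetime_extension_factor(54, 3)
print(lef)  # 18.0 -> the network lifetime is extended 18 times
```

    Using all sensors gives the baseline LEF of 1, matching the "before HSS" rows of the evaluation tables.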

    The main purpose of HSS is to choose the most suitable number of features (sensors) in the preprocessing stage based on hybrid statistical (filter-wrapper) methods and to examine the proposed technique's superiority using classification accuracy. Two necessary steps are carried out to conserve energy and prolong the lifetime of sensor nodes. First, the most dominant sensor nodes in the WSN are chosen based on the proposed Hybrid Sensor Selection (HSS) technique. Second, the K-NN, linear SVM, and RF classification algorithms are applied to evaluate the energy conservation achieved in the WSN. The proposed technique operates in four stages, as shown in Fig. 1.

    3.1 Data Preprocessing

    In the preprocessing procedure, the first step is to obtain the related dataset. This dataset should consist of data collected from several WSN environment resources. Four datasets, obtained from the public UCI and Kaggle repositories, have been examined. In the preprocessing stage, as in Fig. 2, it is vital to identify and handle missing records correctly; if not, inexact and wrong conclusions may be drawn from the data.
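    The missing-record handling mentioned above can be sketched as follows. Column-mean imputation is an assumption of ours, since the paper does not state which strategy was used:

```python
def impute_missing(rows, missing=None):
    """Replace missing sensor readings with the column (sensor) mean -- one
    common strategy; the paper does not specify its imputation method."""
    n_cols = len(rows[0])
    # Column means computed over the observed (non-missing) values only.
    means = []
    for c in range(n_cols):
        observed = [r[c] for r in rows if r[c] is not missing]
        means.append(sum(observed) / len(observed))
    return [[means[c] if r[c] is missing else r[c] for c in range(n_cols)]
            for r in rows]

# Two sensors, three readings, with one gap per sensor.
readings = [[1.0, 2.0], [None, 4.0], [3.0, None]]
cleaned = impute_missing(readings)
print(cleaned)  # [[1.0, 2.0], [2.0, 4.0], [3.0, 3.0]]
```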

    Figure 1: General architecture of the proposed technique

    Figure 2: Data preprocessing

    3.2 Hybrid Sensor Selection (HSS) Technique

    The importance of sensor selection in the proposed HSS technique comes from feature selection in machine learning. This work uses a hybrid sensor selection that consists of filter methods and a wrapper method, as shown in Fig. 3. The preprocessed datasets are used as inputs to the HSS technique. The selection of sensors is independent of any machine learning algorithm; instead, sensors are chosen based on their scores on different statistical tests of correlation with the outcome. The HSS technique consists of four steps to obtain a sub-optimal set of sensors. The first step is a blend of filter techniques carried out to reduce the number of sensors; each filter technique calculates a score for every sensor.

    In the second step, scores are normalized to values in the range (0, 1). In the third step, the two scores obtained for each sensor are blended, and the average of the blended scores is used as a threshold decision to isolate the significant sensors and hence minimize the dataset. In the fourth step, the dataset consisting of only the selected sensors is subjected to the following wrapper-method step.
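    The normalize-blend-threshold steps above can be sketched as follows. Min-max normalization is assumed (the paper only says scores are scaled into (0, 1)), and the two normalized scores are combined here by summation, an assumption suggested by the experiment-one threshold of 1.032 reported later, since an average of two (0, 1) scores could not exceed 1:

```python
def minmax(scores):
    """Scale raw filter scores into the range (0, 1)."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def blend_and_select(fisher_scores, anova_scores):
    """Combine the two normalized scores per sensor and keep only sensors
    whose combined score exceeds the mean combined score (the threshold)."""
    combined = [f + a for f, a in zip(minmax(fisher_scores), minmax(anova_scores))]
    threshold = sum(combined) / len(combined)
    return [i for i, s in enumerate(combined) if s > threshold]

# Toy scores for four hypothetical sensors; sensors 1 and 2 survive.
kept = blend_and_select([0.1, 0.9, 0.5, 0.2], [0.2, 0.8, 0.7, 0.1])
print(kept)  # [1, 2]
```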

    Figure 3: Hybrid sensor selection (HSS) technique

    The first filter method is the Fisher score algorithm [17]. It selects sensors based on their scores using the chi-square method, which leads to a well-representative set of sensors ranked by importance. Sensors with the highest Fisher scores are used as the subset of optimal sensors for further analysis. The second filter method is the ANOVA test [18]. It is used to check the variance between sensors and to compare the average values of more than two sensors by analyzing variances. After the two filter methods have been completed and a score has been set for each sensor according to its importance for the target, the scores are normalized to values in the range (0, 1). They are then blended, and sensors with a score higher than the average score are selected.
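    The ANOVA test used as the second filter scores a sensor by comparing the variance of its readings between classes to the variance within classes. A minimal hand-rolled version of the one-way ANOVA F-statistic (in practice a library routine from a statistics package would be used):

```python
def anova_f(values, labels):
    """One-way ANOVA F-statistic for a single sensor: between-class variance
    over within-class variance of its readings, grouped by class label."""
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(y, []).append(v)
    n, k = len(values), len(groups)
    grand_mean = sum(values) / n
    # Sum of squares between groups (weighted by group size) and within groups.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups.values())
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g)
                    for g in groups.values())
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# A sensor whose readings separate the two classes cleanly scores high.
f = anova_f([1, 2, 3, 7, 8, 9], [0, 0, 0, 1, 1, 1])
print(f)  # 54.0
```

    A high F value means the sensor discriminates well between the target classes, so it receives a high filter score.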

    The final step of the HSS technique is the wrapper method [19] (exhaustive search). It is used to select a sub-optimal set of sensors in a reasonable time without affecting network functionality. The final datasets are extracted by finding the collection of sensors that provides the highest classifier accuracy. Then, the LEF of the network is calculated.
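    The exhaustive wrapper step can be sketched as below. The `evaluate` callback stands in for classifier training and validation, which the paper performs at the base station; the toy scoring function is purely illustrative:

```python
from itertools import combinations

def exhaustive_wrapper(sensor_ids, evaluate, max_size):
    """Wrapper step: try every subset of the filtered sensors up to max_size
    and return the subset with the highest classifier accuracy. `evaluate`
    is assumed to train/validate a classifier and return its accuracy."""
    best_subset, best_acc = (), 0.0
    for size in range(1, max_size + 1):
        for subset in combinations(sensor_ids, size):
            acc = evaluate(subset)
            if acc > best_acc:
                best_subset, best_acc = subset, acc
    return best_subset, best_acc

# Toy stand-in for classifier accuracy that favors sensors 2 and 5.
toy_eval = lambda s: sum(1.0 for i in s if i in (2, 5)) / (len(s) + 1)
best, acc = exhaustive_wrapper([1, 2, 3, 5], toy_eval, 2)
print(best)  # (2, 5)
```

    Because the filters have already shrunk the candidate set, the exponential cost of exhaustive search stays tractable, which is the point of the filter-wrapper hybrid.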

    3.3 Classifier Techniques

    The performance evaluation is based on the classification output, in terms of both accuracy and the lifetime extension factor (LEF). The K-NN, SVM, and RF classifiers are used for the experimental study, as Fig. 4 indicates.

    3.4 Performance Evaluation

    This stage's result exhibits the proposed HSS performance and its efficacy in terms of LEF and classification accuracy. The overall performance of the three intelligent classification algorithms is measured through the confusion matrix [20]. The matrix provides a view of how well the classifier performs on the processed dataset. For each experiment, the repeated k-fold cross-validation method is used to evaluate the proposed HSS performance, with K = 10 and three repeats. Besides, overall performance metrics like precision, recall, and F1-score can be obtained from this matrix.
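    The repeated k-fold scheme with K = 10 and three repeats can be sketched with the standard library alone (a library implementation such as scikit-learn's `RepeatedKFold` would normally be used):

```python
import random

def repeated_kfold_indices(n_samples, k=10, repeats=3, seed=0):
    """Yield (train_idx, test_idx) pairs for repeated k-fold cross-validation:
    each repeat reshuffles the data and splits it into k folds, as used here
    with K = 10 and three repeats (30 evaluations per classifier)."""
    rng = random.Random(seed)
    for _ in range(repeats):
        idx = list(range(n_samples))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]
        for test in folds:
            test_set = set(test)
            train = [i for i in idx if i not in test_set]
            yield train, test

splits = list(repeated_kfold_indices(100, k=10, repeats=3))
print(len(splits))  # 30 train/test splits in total
```

    Averaging the metric over all 30 splits reduces the variance of the accuracy estimate compared with a single train/test split.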

    In this work, seven different classifiers have been tested: Bagging, Random Forest (RF), Extra Trees (ET), Logistic Regression (LR), Decision Tree (DT), SVM, and K-NN. All classifiers showed good accuracy results. The highest accuracy records belong to the SVM, K-NN, and RF classifiers; these three classifiers have been adopted in this paper to highlight the performance of the proposed HSS.

    Figure 4: Classifier techniques

    4 Experiments and Results

    Four experiments were performed on four distinct datasets. Each experiment is divided into three cases. In each case, performance metrics such as the number of selected sensors, LEF, and accuracy are measured to assess the proposed HSS performance in different situations.

    Energy-efficient management is mostly conceptualized as the mechanisms and strategies by which the overall energy of the network is allocated, arranged, and used effectively by all sensors so that the network remains fully functional for its expected life. WSN energy is very restricted, particularly in remote areas where obtaining additional energy is impractical because of environmental constraints. It is therefore important to achieve a careful equilibrium between energy supply and network load through energy conservation to avoid early energy exhaustion. As a result, designing a network that satisfies specific application requirements and the required network lifetime remains a challenge. Reporting individual sensor measurements with highly correlated and irrelevant sensed data is energy-consuming and memory-inefficient because of the nature of the monitored conditions and the large number of sensors deployed. Therefore, schemes that manage network energy efficiently at the data collection phase are primary to the sustainability of the network. That is achieved by minimizing the amount of needless data collection and communication, which reduces energy dissipation and prolongs the network lifespan.

    4.1 Experimental Setup

    All experiments were performed on an AMD A10-8700P Radeon R6 at 1.80 GHz with 8 GB of RAM, running a 64-bit Windows 10 operating system. The software was implemented on Anaconda 2019 using Python 3.

    4.2 Datasets

    The four datasets used in this paper were downloaded from the UCI and Kaggle machine learning repositories [21]. The Ionosphere dataset is a radar dataset gathered at Goose Bay, Labrador; it comprises 34 real sensor nodes and has two classes of radar signal. The Forest Cover Type dataset was developed at the University of Colorado and is used for predicting unknown regions. Sensor Discrimination is a labeled dataset with three classes: "group A," "group B," and "false alarm." Besides, ISOLET is a large dataset divided into five parts. A brief description of the datasets is provided in Tab. 1.

    Table 1: Description of datasets in-brief

    4.3 Experiment One

    This experiment is performed on the Sensor Discrimination dataset, which consists of 12 sensors and 2,212 instances. The recipient node must identify the unknown sample and tag which category it belongs to. Each sensor gets a score indicating its importance with respect to the target. Fig. 5 illustrates the results using the Fisher score based on the chi-square method after normalization. It shows that many sensors have a high impact on the output, beginning with sensor F10 and ending with F12. The ANOVA test determines how much each sensor affects the output: if a sensor's score is low, that sensor has no impact on the output, and vice versa. Fig. 6 shows each sensor's score using the ANOVA test after normalization. It indicates that sensor F12 has the highest impact and F8 the lowest impact on the output.

    After combining all sensor scores obtained from the Fisher and ANOVA filters, the average of all combined sensor scores is calculated (equal to 1.032 in experiment one), as shown in Fig. 7. This average is treated as the threshold decision: sensor scores are compared with the threshold value, and only sensors with scores greater than the threshold pass to the next step. This process minimizes the dataset and extracts only the interesting sensors. The reduced dataset is used as input to the wrapper exhaustive method to find the best combination of input sensors. The exhaustive search starts by looking for the best one-component subset of the input sensors. After that, it discovers the best two-component sensor subset, which may consist of any pair of the input sensors. It then continues to find the best triple out of all collections of any three input sensors, and so on. The best selected sensors are then used to evaluate the classifier's accuracy.

    Figure 5: Fisher output after normalization for experiment one

    Figure 6: ANOVA test output after normalization for experiment one

    Figure 7: All sensor scores after combining the two methods for experiment one

    Tab. 2 indicates the performance evaluation parameters, such as the number of selected sensors, LEF, and accuracy, before applying the proposed HSS technique. In experiment one on the sensor discrimination dataset, all 12 sensors were used before applying HSS. This corresponds to a network LEF of 1, with accuracies of 99%, 99%, and 98% for K-NN, RF, and SVM, respectively. After the HSS implementation, three cases were used to increase the network LEF with an acceptable accuracy level. For example, in case 1, with 2 out of 12 sensors selected, the network LEF increases to 6 with accuracies of 84%, 84%, and 83% for K-NN, RF, and SVM, respectively; the best two sensors are F11 and F10. In cases 2 and 3, the network LEF increased to 4 and 2.4, respectively, with 4 and 5 sensors selected out of 12.

    Table 2: Performance evaluation before and after the HSS implementation for experiment one

    Figure 8: Fisher output after normalization for experiment two

    4.4 Experiment Two

    This experiment is performed on the Ionosphere dataset, a radar dataset gathered from 34 real sensor nodes with 351 instances. The same steps taken in experiment one are implemented here. Fig. 8 shows the Fisher score output after normalization for experiment two. Fig. 9 shows the normalized ANOVA test output and the sensors' scores from f1 to f34. Many sensors score higher than 0.5, which means several sensors will be chosen as the best. After combining the two filter methods, as in Fig. 10, the threshold decision is equal to 0.811. The sensors with the highest scores are extracted. This process yields a reduced dataset that is fed to the exhaustive search method to choose the best number of selected sensors.

    Figure 9: ANOVA test output after normalization for experiment two

    Figure 10: All sensor scores after combining the two methods for experiment two

    In experiment two, all 34 sensors were used before applying the proposed HSS technique, corresponding to a network LEF of 1 and accuracies of 84%, 92%, and 92% for K-NN, RF, and SVM, respectively. After the HSS implementation, three cases were used to increase the network LEF with an acceptable accuracy level. In case 1, with 5 out of 34 sensors selected, the network LEF is increased to 6.6 with accuracies of 81%, 81%, and 85% for K-NN, RF, and SVM, respectively. In cases 2 and 3, the network LEF increased to 3.3 and 2.2, respectively, with 10 and 15 sensors selected and accuracy levels close to the results before applying the proposed HSS technique. All obtained results are shown in Tab. 3.

    Table 3: Performance evaluation before and after the HSS implementation for experiment two

    4.5 Experiment Three

    The motivation of the ISOLET dataset is to predict which letter or name was spoken. ISOLET is a massive dataset with 7,797 records and 617 sensors, divided into five parts. In this work, only the Isolet5 subset is used, with 1,559 records and 617 sensors, due to memory size limitations. For that reason, the Fisher and ANOVA plots are not presented, since there are too many sensors to show, unlike the previous two datasets. For simplicity, only the results after combining the two filter methods are reported.

    The proposed technique nominates 174 sensors that have a score higher than the average score of all sensors (threshold = 0.507). These 174 sensors are fed to the exhaustive search method to select the best 10, 20, and 30 sensors, as illustrated in Tab. 4.

    Table 4: Performance evaluation before and after the HSS implementation for experiment three

    4.6 Experiment Four

    Forest Cover is one of the biggest datasets processed in this work. The first 25,000 records were initially chosen because of the limited memory size. Unfortunately, these specific records contained NaN values in some attributes, such as the A15 and A25 sensors. Therefore, it was more convenient to select records by sampling the whole dataset to overcome this issue. Sampling every (i + 1)-th record avoided the NaN values, and the number of selected records was about 39,000. The same steps taken in experiments one, two, and three have been performed here. Fig. 11 shows the Fisher score output after normalization for experiment four. Eleven sensors achieved a high impact on the output, beginning with sensor A0 and ending with A53.

    Figure 11: Fisher output after normalization for experiment four

    Fig. 12 shows the two highest sensor scores, A5 and A9, according to the ANOVA test after normalization for experiment four. The average is similarly used as a threshold to nominate a reduced dataset consisting of 11 sensors, as shown in Fig. 13. This reduced dataset was fed to the exhaustive search method, which selected A0, A5, and A9 as the best three sensors.

    Figure 12: ANOVA test output after normalization for experiment four

    Figure 13: All sensor scores after combining the two methods for experiment four

    In experiment four, all 54 sensors were used before applying the proposed HSS technique (LEF equal to 1), with accuracies of 86%, 90%, and 41% for K-NN, RF, and SVM, respectively. After the HSS implementation, three cases were considered to increase the network LEF with an acceptable accuracy level. In case 1, with 3 out of 54 sensors selected, the network LEF is increased to 18 with accuracies of 80%, 82%, and 40% for K-NN, RF, and SVM, respectively, where the best three sensors are A0, A5, and A9. In cases 2 and 3, the network LEF increased to 9 and 6.75, respectively, with 6 and 8 selected sensors. The accuracy level was close to the results before applying the proposed HSS technique. All obtained results are shown in Tab. 5.

    Table 5: Performance evaluation before and after the HSS implementation for experiment four

    5 Results and Discussions

    The overall performance of the intelligent classification algorithms is measured through the confusion matrix. This matrix shows how well the classifier performs on the input dataset. A variety of performance measures, like accuracy, precision, recall, and F1-score, can be obtained from it. Tab. 6 indicates the structure of the confusion matrix, from which the True Positive (TP), False Positive (FP), False Negative (FN), and True Negative (TN) counts are obtained. Accuracy is one measure for evaluating classification systems; it can be defined as the fraction of predictions the model gets right, as Eq. (2) shows:

    Accuracy = (number of correct predictions) / (total number of predictions)    (2)

    Table 6: Structure of the confusion matrix

    In classification, the accuracy may be computed from the positive and negative counts, as Eq. (3) shows:

    Accuracy = (TP + TN) / (TP + TN + FP + FN)    (3)

    True Positives (TP) are the items where the actual class is positive and the prediction is correctly positive. Similarly, True Negatives (TN) are the items where the actual class is negative and the prediction is correctly negative. Conversely, False Positives (FP) are the items where the actual class is negative but the prediction is incorrectly positive, and False Negatives (FN) are the items where the actual class is positive but the prediction is incorrectly negative. Tabs. 7 and 8 show the confusion matrices for linear SVM over the sensor discrimination dataset with selections of 2 and 4 out of 12 sensors. The sum of the diagonal represents the number of properly classified samples. For instance, the total number of properly classified samples for SVM is 366, that is, the sum of 185, 146, and 35. In this case, the accuracy is 366/443 × 100 ≈ 83%, where the total number of instances in the testing dataset is 443. The calculated accuracy is 84% for K-NN with 2 out of 12 sensors selected. The same steps are applied to the remaining cases with more selected sensors and the proposed classifiers.
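    The diagonal-sum accuracy computation described above can be sketched as follows; the class labels in the toy example are hypothetical, while the 366-of-443 figures are the ones reported for experiment one:

```python
def confusion_matrix(actual, predicted, n_classes):
    """Build a confusion matrix: rows are actual classes, columns predictions."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for a, p in zip(actual, predicted):
        m[a][p] += 1
    return m

def accuracy(matrix):
    """Eqs. (2)/(3): diagonal sum (correct predictions) over all samples."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    return correct / sum(sum(row) for row in matrix)

# Experiment one (SVM, 2 of 12 sensors): the diagonal sums to
# 185 + 146 + 35 = 366 correctly classified of 443 test samples.
exp_one_accuracy = round(366 / 443 * 100)
print(exp_one_accuracy)  # 83
```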

    Table 7: SVM confusion matrix for experiment one with 2 of 12 selections

    Tab. 9 illustrates the linear SVM confusion matrix over the sensor discrimination dataset when selecting 5 out of 12 sensors. The total number of properly classified samples for SVM is 434 (the sum of 201, 152, and 81). In this case, the accuracy is 434/443 × 100 ≈ 98%, while the total number of records in the testing dataset is 443. Similarly, the calculated accuracy for K-NN is 99%.

    Table 8: SVM confusion matrix for experiment one with 4 of 12 selections

    Table 9: SVM confusion matrix for experiment one with 5 of 12 selections

    The classification output shows the precision, recall, and F1-score [22] given by SVM for the sensor discrimination dataset with 2 out of 12 sensors selected, as shown in Tab. 10. Whenever a model is built, these parameters help figure out how well it has performed. The precision is the rate of correctly predicted positive classes to all predicted positive classes, as Eq. (4) shows:

    Precision = TP / (TP + FP)    (4)

    In other words, of all samples labeled as group A, it measures how many actually belong to group A. High precision corresponds to a low false-positive rate; in this situation, it is 0.91.

    Table 10: Classification output of SVM for experiment one with a selection 2 of 12 sensors

    The recall is the rate of correctly predicted positive classes to all observations in the actual class, as Eq. (5) shows:

    Recall = TP / (TP + FN)    (5)

    In other words, of all samples actually in group A, it measures how many are detected. The recall here is 0.94, which is suitable for this example as it is greater than 0.5.

    The F1-score is the harmonic mean of recall and precision; hence, this value takes both FP and FN into account. It is not as intuitive as accuracy; however, F1 is often more informative than accuracy, particularly when the class distribution is imbalanced. Accuracy performs well when FN and FP have similar costs; if the costs of FNs and FPs are very different, it is preferable to examine recall and precision separately. For group A, the F1-score is 0.93, as Eq. (6) shows.
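    The three metrics in Eqs. (4)–(6) can be computed directly from the TP/FP/FN counts defined above. This is a minimal sketch; the function names are ours, not the paper's.

```python
def precision(tp, fp):
    # Eq. (4): correctly predicted positives / all predicted positives
    return tp / (tp + fp)

def recall(tp, fn):
    # Eq. (5): correctly predicted positives / all actual positives
    return tp / (tp + fn)

def f1_score(p, r):
    # Eq. (6): harmonic mean of precision and recall
    return 2 * p * r / (p + r)

# Example with round numbers: 3 TP, 1 FP, 1 FN
p, r = precision(3, 1), recall(3, 1)
print(p, r, f1_score(p, r))  # 0.75 0.75 0.75
```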

    The number of selected sensors impacts the network LEF: as the number of selected sensors increases, the network LEF decreases. At the same time, the accuracy is affected, so network designers have to trade off accuracy against LEF using the best-tuned sensor selection. Fig. 14 shows the LEF and the number of sensor selections for all previous experiments. Researchers usually focus on the accuracy of the machine learning technique they are using; some, however, also compare their findings to other available methods. As for time performance, there are two parts: the first is the total time taken to build the training technique, and the second is the total time taken to produce results on the test data; the training and test times are then combined. Naturally, the accuracy results before HSS are better than after implementing HSS, for the following reasons:

    • The WSNs are not operating with 100% of their sensors.

    • The WSN designers or administrators have to trade off accuracy against network lifetime by using the best-tuned sensor selection.

    • Exploiting the acceptable tolerance in accuracy was the main idea of this research work. The performance evaluation tables before and after the HSS implementation clearly demonstrate the lifetime extension achieved through energy conservation while maintaining an acceptable accuracy measure.
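    The trade-off above can be sketched by modeling the lifetime extension factor as the ratio of total to active sensors, so that idle sensors can be duty-cycled. This formula is our illustrative assumption, not the paper's exact definition of LEF.

```python
def lifetime_extension_factor(total_sensors, selected_sensors):
    # Assumed model: powering down unselected sensors lets nodes rotate
    # duty cycles, so lifetime scales with the ratio of total to active
    # sensors (illustrative assumption, not the paper's formula).
    return total_sensors / selected_sensors

# Fewer selected sensors -> larger LEF, mirroring the trade-off above
for k in (5, 4, 2):
    print(k, lifetime_extension_factor(12, k))  # 2.4, 3.0, 6.0
```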

    Figure 14: Lifetime extension factor of all experiments

    Table 11: Average time (in seconds) used to build the proposed HSS for experiment one

    Table 12: Comparison between the proposed HSS and previous work

    Tab. 11 shows the average time, in seconds, taken before and after building the proposed HSS. RF takes the longest time, 0.210 s, to build the technique before using the proposed HSS; after using the proposed HSS, RF takes less time in all three sensor-selection cases. The RF classifier also takes more time than SVM and K-NN. For example, the longest time in experiment one occurred when five sensors were chosen; in this case, RF took 0.089 s longer than the K-NN classifier and 0.052 s longer than SVM. It is clear that the proposed HSS reduces the computational time required in each selection case.
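    The two-part timing methodology described above (training time plus test time, then combined) can be sketched as follows. Here `train` and `predict` are hypothetical stand-ins for the real SVM/K-NN/RF fit and inference calls, not the paper's code.

```python
import time

def timed(fn, *args):
    """Run fn once and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

# Hypothetical placeholders for model fitting and inference on the
# reduced sensor subset; real code would call the classifier here.
def train(data):
    return sum(data)

def predict(model, sample):
    return model > sample

_, train_time = timed(train, range(10_000))
_, test_time = timed(predict, 42, 7)
total_time = train_time + test_time  # the combined figure, as in Tab. 11
print(total_time >= 0.0)  # True
```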

    Tab. 12 shows the comparison between the proposed HSS and previous work on WSN energy conservation. This comparison used four datasets, namely Sensor discrimination, Ionosphere, ISOLET, and Forest cover type. In each dataset, there are three cases to demonstrate the superiority of the proposed HSS technique. In each case, three evaluation factors were used to compare five studies regarding the number of sensor selections, the network LEF, and the accuracy level.

    Experimental results indicated that, using the best classifier in case 1 for the sensor discrimination dataset, the HSS technique reduced the number of selected sensors by 1 and doubled the network LEF. Moreover, it achieved a higher accuracy level than other studies in the literature; similar results were achieved for cases 2 and 3. In the Ionosphere dataset, the proposed HSS technique used half the number of sensors used in the previous work and could double the network LEF with an acceptable accuracy level. Furthermore, in the ISOLET dataset, HSS used the same selected sensors as [12] and delivered the same LEF with higher accuracy. The superiority of HSS was also demonstrated in the Forest cover type dataset, where it used fewer than 50% of the sensors used in [7,12,14] and doubled the LEF with a higher accuracy level.

    6 Conclusions and Future Work

    This paper proposes an intelligent HSS technique for efficient energy conservation in WSNs. It selects a subset of sensors to increase the network lifetime by removing redundant, noisy, or irrelevant sensors. The proposed HSS technique combines two filter methods and a wrapper method for sensor selection. The HSS technique provides a larger informative subset in an acceptable time. Furthermore, it can be used on all kinds of datasets with no prior assumptions about the data and is appropriate for big or imbalanced datasets. HSS has been used effectively on several datasets from the UCI and Kaggle repositories: Sensor Discrimination, Ionosphere, ISOLET, and Forest Cover Type. In each dataset, three cases were considered to demonstrate the superiority of the HSS technique. Moreover, in each case, three evaluation factors were highlighted to compare five different previous works in terms of the number of sensors selected, the network LEF, and the accuracy level. Evaluation results indicated that the proposed HSS technique could double the network lifetime and increase the accuracy by more than 20% with the minimum number of sensors compared with previous methodologies, without affecting the network functionality. For future work, the classifiers' performance on different benchmark datasets with several kernels, such as sigmoid or Gaussian, will be investigated. Moreover, this work will be extended to various purposes, such as prediction and clustering. It would also be desirable to extend the experiments to include more datasets acquired from real high-dimensional WSN data.

    Funding Statement:The authors received no funding for this study.

    Conflicts of Interest:The authors declare that they have no conflicts of interest to report regarding the present study.
