
    Deep Learning and Entity Embedding-Based Intrusion Detection Model for Wireless Sensor Networks

    2021-12-10
    Computers, Materials & Continua, 2021, Issue 10

    Bandar Almaslukh

    Department of Computer Science,College of Computer Engineering and Sciences,Prince Sattam bin Abdulaziz University,Al-Kharj,11942,Saudi Arabia

    Abstract: Wireless sensor networks (WSNs) are considered promising for applications such as military surveillance and healthcare. The security of these networks must be ensured in order to have reliable applications. Securing such networks requires particular attention, as they typically implement no dedicated security appliance. In addition, the sensors have limited computing, power, and storage resources, which makes WSNs vulnerable to various attacks, especially denial of service (DoS). The main types of DoS attacks against WSNs are blackhole, grayhole, flooding, and scheduling. There are two primary techniques for building an intrusion detection system (IDS): signature-based and data-driven. This study uses the data-driven approach, since the signature-based method fails to detect zero-day attacks. Several publications have proposed data-driven approaches to protect WSNs against such attacks, based on either traditional machine learning (ML) methods or deep learning models. The fundamental limitation of these methods is the use of raw features to build an intrusion detection model, which can result in low detection accuracy. This study applies entity embedding to transform the raw features into a more robust representation that enables more precise detection, and demonstrates how the proposed method can outperform state-of-the-art solutions in terms of recognition accuracy.

    Keywords: Wireless sensor networks; intrusion detection; deep learning; entity embedding; artificial neural networks

    1 Introduction

    With the variety of applications of wireless sensor networks (WSNs) in areas such as environmental observation, smart cities, military surveillance, and healthcare [1], there have been an increasing number of security breaches in such networks. A primary factor is the sensors' limited computing resources, such as processing, storage, memory, and power, which enable denial of service (DoS) attacks such as blackhole, scheduling, grayhole, and flooding. DoS attack detection is treated as a machine learning task, i.e., a supervised classification problem. Applicable methods include traditional machine learning and deep learning.

    Deep learning often outperforms traditional machine learning on unstructured data problems such as image and audio recognition and natural language processing, and is increasingly regarded as the ideal solution for such data. This has led to the question of whether these methods can be similarly successful with structured (tabular) data, such as the data used in this study. Structured data are organized in a tabular format whose columns and rows represent features and data points, respectively. Current state-of-the-art solutions for structured data are ensemble methods such as random forest [2] and gradient boosted trees [3], and deep learning with entity embedding [4] has performed at least as well on tabular data.

    The present study proposes an end-to-end ANN architecture with entity embedding that outperforms recent state-of-the-art intrusion detection models for WSNs on the public WSN-DS dataset [5]. The proposed model depends on neither a complex ensemble model nor handcrafted feature engineering. It treats features as categorical variables and maps them into m-dimensional vectors of continuous variables, transforming each raw feature into a more robust representation. The mapping is learned automatically through supervised training of a standard neural network.

    The remainder of this paper is organized as follows. Section 2 examines related studies. Section 3 explains the materials and methods used in the proposed solution. Section 4 discusses experiments using the proposed model on the WSN-DS dataset. Section 5 presents conclusions and suggested future research.

    2 Related Work

    Applications of WSNs include recognizing climate changes, monitoring habitats and environments, and other surveillance applications [6]. Several security-based methods for WSNs have been proposed, including key exchange protocols, authentication, secure routing protocols, and security solutions targeting a specific type of attack [7,8]. The intrusion detection system (IDS) is a promising solution to identify and defend against WSN attacks [9]. When detecting an attack, the IDS raises an alarm to notify the controller to take action [9,10]. However, WSNs are exposed to numerous security threats, such as DoS, which can degrade their performance. DoS attacks can target the protocol stack or different layers of the sensors [11–13]. To detect attacks, network traffic is examined and an appropriate security mechanism is applied [14].

    IDSs use two main approaches. A rule- or signature-based IDS [15] can accurately detect well-known attacks, but not zero-day attacks whose signatures do not exist in the intrusion signature data. Anomaly-based (data-driven) IDSs detect intrusions by matching abnormal resource utilization and/or traffic patterns against normal behavior. Although anomaly-based IDSs can detect both well-known and zero-day attacks, they produce more false positives and false negatives [15]. The present study reviews related work using anomaly-based IDSs for WSNs, as they are the most closely related to the proposed method. Current IDS methods for WSNs include genetic algorithms and fuzzy rule-based modeling [16], support vector machines (SVM) [17], artificial neural networks [18], Bayesian networks and random trees [19], and random forests [20]. These methods were evaluated on the KDD dataset [21], which was collected for computer networks such as local area networks (LANs) and wide area networks (WANs), not for WSNs with their special characteristics. Some work has used self-made WSN datasets [22–24] that are not publicly available. We used the publicly available WSN-DS dataset [5].

    Some work has been evaluated on the WSN-DS dataset [5], which was collected specifically for WSNs. An artificial neural network was used [5] to build the classifier; feature selection using the information gain ratio and a bagging algorithm was proposed [25]; and naive Bayes, SVM, decision tree, k-nearest neighbor (KNN), and a convolutional neural network (CNN) were utilized [26]. The limitation of these methods is that they use raw features directly. This can be addressed by transforming raw features to a more robust representation using entity embedding.

    We used entity embedding to develop a more accurate classification, and experimentally demonstrated how the proposed method can outperform state-of-the-art solutions.

    3 Materials and Methods

    This section has six subsections. Subsection 3.1 describes the dataset. Subsection 3.2 explains the artificial neural network (ANN) model primarily used in this work. Subsection 3.3 examines the entity embedding of categorical variables [4], the essential technique used in this study. Subsection 3.4 illustrates the proposed architecture. Subsection 3.5 explains the evaluation of the model. Subsection 3.6 presents hyperparameter selection strategies.

    3.1 Dataset

    We used the WSN-DS simulated dataset [5] to build and evaluate the proposed model. The dataset was created to recognize several types of DoS attacks as well as normal traffic. It contains four DoS attack types: blackhole, grayhole, flooding, and scheduling, with 17 features describing the state of each sensor in the WSN, extracted from the Low-Energy Adaptive Clustering Hierarchy (LEACH) routing protocol. These features and the target variable (label) are shown in Tab. 1. Fig. 1 shows the data distribution over normal traffic and the different types of attacks in the WSN-DS dataset. It should be noted that such an extreme imbalance in the class distribution requires a specific type of metric to evaluate model performance, as accuracy is dominated by the majority class (normal).

    3.2 Artificial Neural Network

    We now describe the ANN, the main model used in this paper. The multilayer perceptron (MLP) is the primary ANN architecture [27]. As shown in Fig. 2, the MLP has three types of layers: one input layer, one or more hidden layers, and one output layer. As these layers are densely connected, many weights (parameters) must be configured. These are fine-tuned by the backpropagation algorithm [28], shown in Algorithm 1. The input to the algorithm is a set of input-output pairs (Xi, yi), called training examples. The algorithm outputs the optimized network weights.

    The goal of backpropagation is to fine-tune a network's weights through a sequence of forward and backward iterations to minimize the cost function until the error stops improving or the maximum number of epochs is reached. The algorithm first initializes the weights of the network using a statistical initializer.

    A forward step finds the network output for a randomly selected training example e = (x, y) by computing each layer's output, starting at the input layer and moving toward the output layer. The output of each layer (say layer k) is computed by finding the output of each neuron in the layer. The output of neuron j in layer k is calculated as

    Table 1:List of features with abbreviation,description,and number of levels

    Figure 1:Data distribution over normal and different types of attacks deduced from WSN-DS dataset

    Figure 2:Basic ANN architecture[29]

    a_j = g( Σ_{i=1..n} w_{i,j} · a_i )

    where a_j is the output of neuron j in layer k; n is the number of neurons in the previous layer (k−1); a_1, ..., a_n are the outputs of the neurons in the previous layer (k−1); w_{i,j} is the weight (parameter) connecting neuron i in the previous layer (k−1) to neuron j; and g is the activation function.

    Algorithm 1: Backpropagation
    1: Input: network, training set (x, y)
    2: Output: optimized network weights
    3: Initialize the weights (typically at random)
    4: while the error keeps improving and the maximum number of epochs is not reached do
    5:   Randomly pick e from the training data
    6:   // forward pass
    7:   output = FindNetworkOutput(network, e.x)
    8:   error = e.y − output
    9:   outputGradient = ComputeOutputGradients(e.x, error)
    10:  // backward pass
    11:  for each hidden layer
    12:    error = ComputeHiddenError(error)
    13:    hiddenGradient = ComputeHiddenGradients(e.x, error)
    14:  UpdateNetworkWeights(outputGradient, hiddenGradient)
    15: end while

    List of functions:
    16: function FindNetworkOutput(network, input)
    17:   output = []
    18:   for each layer
    19:     input = FindLayerOutput(layer, input)
    20:   output = input
    21:   return output
    22: end function
    23: function FindLayerOutput(layer, input)
    24:   output = []
    25:   for each neuron i in the layer
    26:     output[i] = FindNeuronOutput(neuron[i], input)
    27:   return output
    28: end function
    29: function FindNeuronOutput(neuron, input)
    30:   W = GetNeuronWeights(neuron)
    31:   X = DotProduct(input, W)
    32:   output = ActivationFunction(X)
    33:   return output
    34: end function

    After applying the forward step, we obtain h_w(e.x), the output vector generated by the network when given the input vector e.x. The squared error can therefore be computed from the difference between the correct output e.y and the network's prediction h_w(e.x):

    error = (e.y − h_w(e.x))²

    Then the gradient of error is calculated by computing the partial derivative of error with respect to each weight.

    The second step is the backward pass, in which computation proceeds from the output layer toward the first hidden layer. The gradient and error are computed using the chain rule [28] for each hidden layer. All network weights are then updated according to the gradients of the output and hidden layers.
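    The forward and backward steps above can be sketched in a few lines of numpy. This is a minimal illustration rather than the paper's implementation: it uses a tiny logical-OR task, full-batch updates instead of the per-example sampling of Algorithm 1, sigmoid activations, and an illustrative hidden-layer size and learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: learn the logical OR of two binary inputs
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [1.]])

# Initialize the weights with a statistical (random) initializer
W1 = rng.normal(scale=0.5, size=(2, 8))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden -> output
lr = 0.5

for epoch in range(2000):
    # Forward step: compute each layer's output, input layer onward
    h = sigmoid(X @ W1)        # a_j = g(sum_i w_ij * a_i) for the hidden layer
    out = sigmoid(h @ W2)      # network output h_w(x)
    err = y - out              # error = e.y - output
    # Backward step: gradients via the chain rule, output layer first
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Update all network weights
    W2 += lr * h.T @ d_out
    W1 += lr * X.T @ d_h

pred = (sigmoid(sigmoid(X @ W1) @ W2) > 0.5).astype(float)
```

    After training, the thresholded network output matches the OR targets on all four examples.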

    3.3 Entity Embedding of Categorical Variables

    Entity embedding of categorical variables [4] is the primary technique used in this paper; it maps categorical variables into m-dimensional vectors of continuous variables. Supervised learning of a standard neural network is used to learn the mapping, as explained previously.

    As an example, assume that entity embedding is applied to the day of the week as a categorical variable of the neural network model. This leads to a K × m matrix that maps the day of the week to an m-dimensional vector, where K is the number of distinct levels of the categorical variable (Monday, Tuesday, etc.) and m is the desired mapping dimension, a hyperparameter chosen in the range 2 to K − 1. In this example, K is 7 (days of the week), and we assume m is 4. To find the 4-dimensional mapping of a specific level (day), the 1 × 7 one-hot encoding of, say, Sunday is multiplied by the 7 × 4 mapping matrix. The values of the mapping matrix can be learned through supervised training of a standard neural network by adding an embedding layer on top of each one-hot-encoded input feature, as shown in Fig. 3. The matrix of weights connecting the one-hot encoding layer and the embedding layer is the mapping matrix, and its weights are learned during training of the neural network by backpropagation. As Fig. 3 shows, the input features are represented as one-hot encodings; the weights (embedding matrix) that connect the one-hot encoding layer with the embedding layer are then used to extract m-dimensional vectors of continuous variables. This continuous representation places categorical values with related effects near each other in the feature vector space.

    For the above example, the one-hot encoding input layer includes seven neurons that indicate the number of levels of the categorical variable, and the embedding layer has four neurons, corresponding to the desired mapping dimension, as shown in Fig. 3.
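    The one-hot-times-matrix view of the embedding can be checked directly. In the following numpy sketch, a randomly initialized K × m matrix stands in for the weights that backpropagation would learn; multiplying a one-hot vector by this matrix is equivalent to a simple row lookup.

```python
import numpy as np

rng = np.random.default_rng(0)

days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
K, m = len(days), 4            # 7 levels, 4-dimensional embedding

# The K x m mapping matrix; in practice its entries are weights
# learned by backpropagation along with the rest of the network.
E = rng.normal(size=(K, m))

def one_hot(level):
    v = np.zeros(K)
    v[days.index(level)] = 1.0
    return v

# The 1 x 7 one-hot encoding of "Sun" times the 7 x 4 matrix
# gives the same 4-dim vector as looking up the matching row.
emb_matmul = one_hot("Sun") @ E
emb_lookup = E[days.index("Sun")]
```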

    Figure 3: Entity embedding layer added on top of a one-hot encoded input feature

    3.4 Proposed Model

    In this work, a multi-layer ANN-based architecture is proposed for a WSN intrusion detection system, as shown in Fig. 4; Tab. 2 details the notation. Multi-layer ANNs (deep models) enable us to effectively learn nonlinear decision boundaries for more accurate classification, and entity embedding automatically learns a robust representation of the raw features with the help of the deep model. This work shows how to determine the optimal architecture (number of hidden layers, number of neurons in each layer, and embedding size) and investigates the optimal values of other model hyperparameters, such as the activation function and learning rate.

    Figure 4:Proposed model architecture


    Table 2:Notation used in Fig.4

    As most input features for WSN intrusion detection have a limited number of levels, as shown in Tab. 1, we treat them as categorical variables. The proposed architecture represents the majority of the input features using the concept of entity embedding explained above. Entity embedding is not used directly for input features with a very large number of unique values (# of levels), namely Who_CH, Dist_To_CH, and Consumed Energy, as they would require a large number of parameters to be learned. For example, the Dist_To_CH feature has 13,956 distinct values, which would need 139,560 weights to learn an embedding vector of size 10. Indeed, when we used entity embedding for these values directly, the model overfit. We therefore performed discretization (binning) on these features before passing them to the model, decomposing each feature into a set of bins. In the present study, we use 300, 100, and 100 bins for Consumed Energy, Dist_To_CH, and Who_CH, respectively.
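    The binning step can be sketched with numpy's `digitize`. The data below are synthetic stand-ins (the real Dist_To_CH feature has about 13,956 distinct values); only the bin count of 100 comes from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the Dist_To_CH feature
dist_to_ch = rng.uniform(0.0, 200.0, size=10_000)

n_bins = 100  # 100 bins are used for Dist_To_CH
edges = np.linspace(dist_to_ch.min(), dist_to_ch.max(), n_bins + 1)
# np.digitize maps each value to a 1-based bin index; clip the
# right edge (which falls past the last bin) and shift to 0-based.
bin_ids = np.clip(np.digitize(dist_to_ch, edges), 1, n_bins) - 1

# The categorical variable now has at most n_bins levels, so the
# embedding matrix shrinks from 13956 x 10 to 100 x 10 weights.
```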

    Then, in the first layer, each categorical input feature is transformed into a one-hot representation, and an embedding layer of size 10 is added on top of it. This layer learns the embedding matrix that maps each input feature to a 10-dimensional continuous vector. It helps place values with similar effects near each other in the vector space, thus exposing the fundamental continuity of the data and ensuring a robust representation of the input features.

    Next, all learned embedding vectors are concatenated with the binary features, which include only two levels and therefore do not require embeddings.

    Then, two fully connected layers (256–256 neurons) are added on top of the concatenation layer and used to learn the classification weights. A softmax layer of five neurons is added on top of these layers to compute the probability distribution over the attack labels. The intuition behind the selected hyperparameter values (# of layers, # of neurons in each layer, embedding size, learning rate, and activation function) is given in Subsection 3.6.

    Finally, the training data are fed into the network, whose parameters (weights) are optimized using an adjusted version of stochastic gradient descent (Adam) and a backpropagation algorithm. Algorithm 2 presents the steps used to train the proposed architecture.
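    To make the tensor shapes of the architecture concrete, the forward pass can be sketched in plain numpy. The counts of embedded and binary features below are illustrative assumptions (the actual split follows Tab. 1), the weights are random rather than trained, and biases are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

batch = 32
# Hypothetical layout: 14 embedded categorical features (10-dim each)
# plus 3 binary features passed through directly.
n_embedded, emb_dim, n_binary = 14, 10, 3

embeddings = [rng.normal(size=(batch, emb_dim)) for _ in range(n_embedded)]
binary = rng.integers(0, 2, size=(batch, n_binary)).astype(float)

# Concatenation layer: 14*10 + 3 = 143 inputs to the dense stack
x = np.concatenate(embeddings + [binary], axis=1)

# Two fully connected layers of 256 neurons, then a 5-way softmax
W1 = rng.normal(scale=0.05, size=(x.shape[1], 256))
W2 = rng.normal(scale=0.05, size=(256, 256))
W3 = rng.normal(scale=0.05, size=(256, 5))

h = relu(relu(x @ W1) @ W2)
probs = softmax(h @ W3)   # one probability per attack label
```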

    Algorithm 2: Proposed model
    1: Input: WSN-DS
    2: Output: optimized intrusion detection model
    3: Preprocessing phase:
    4:   Discretize the Who_CH, Dist_To_CH, and Consumed Energy features
    5:   Transform the input features to one-hot encoding
    6:   Divide the dataset into training and testing sets
    7: Training phase:
    8:   // Model hyperparameters:
    9:   num_epochs = 100
    10:  batch_size = 4096
    11:  optimizer = Adam
    12:  loss = categorical_crossentropy
    13:  learning_rate = 0.001
    14:  network = initialize the architecture of Fig. 4 with random weights
    15:  for i = 1 to num_epochs do
    16:    Randomly sample batch_size data points from the training set
    17:    Apply backpropagation (Algorithm 1) to the network to fit the training data
    18:    // backpropagation with the Adam optimizer (lr = 0.001) and the categorical_crossentropy loss function
    19: Testing phase:
    20:  for i = 1 to size_of_testing_set do
    21:    Perform forward propagation: FindNetworkOutput(network, input(i.x))
    22:    Return the class label with the highest probability

    3.5 Model Evaluation

    Different measures are available to compute evaluation scores for the classification problem. For intrusion detection systems, significant metrics include the detection rate/recall/true positive rate (TPR), precision, false alarm rate/false positive rate (FPR), and F-score, defined as follows:

    Recall (TPR) = TP / (TP + FN)
    Precision = TP / (TP + FP)
    FPR = FP / (FP + TN)
    F-score = 2 · Precision · Recall / (Precision + Recall)

    where TP, FP, FN, and TN are the numbers of true positives, false positives, false negatives, and true negatives, respectively.

    In a multi-class classification problem, the evaluation measures for all classes can be averaged as either a micro- or macro-average to obtain the general performance of a model. A micro-average is dominated by the majority class, as it treats each data point (instance) equally, whereas a macro-average gives equal weight to each class. We study an intrusion detection dataset with significantly more data points of the normal class than of the abnormal classes, and use macro-averaging to prevent the majority (normal) class from dominating the measured performance of the model. To accelerate decision-making during model optimization and compare the proposed model to state-of-the-art methods, we use the macro-average F-score. In addition, a micro-average F-score is calculated for comparison with related work that uses a micro-average as a metric. Metrics such as recall, precision, and false alarm rate are also used to obtain detailed information about the model's performance.
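    The gap between the two averages is easy to see on a toy imbalanced example. The following self-contained sketch uses a hypothetical label distribution (not WSN-DS figures): a classifier that misses every instance of one rare class still scores highly on the micro-average but is penalized by the macro-average.

```python
from collections import Counter

def f1_scores(y_true, y_pred):
    """Return (macro-avg F1, micro-avg F1) for single-label predictions."""
    classes = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1
            fn[t] += 1
    per_class = []
    for c in classes:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        per_class.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    macro = sum(per_class) / len(classes)
    # For single-label tasks, micro precision == recall == F1 == accuracy
    micro = sum(tp.values()) / len(y_true)
    return macro, micro

# 90 normal points, two rare attack classes of 5 points each;
# the classifier misses every "grayhole" instance.
y_true = ["normal"] * 90 + ["grayhole"] * 5 + ["flooding"] * 5
y_pred = ["normal"] * 95 + ["flooding"] * 5
macro, micro = f1_scores(y_true, y_pred)
# micro stays high (dominated by "normal"); macro exposes the missed class
```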

    3.6 Hyperparameter Selection

    The proposed ANN architecture has several hyperparameters that must be selected carefully, as they impact the model's performance. A grid search is used to select the optimal combination: we tune the embedding size, number of hidden layers, number of neurons in each layer, activation function, and learning rate.

    Embedding size: Fig. 5 illustrates the dependency between the embedding size and the macro-avg. F-score. Performance improved slightly as the embedding size increased, peaked for embedding sizes between 10 and 20, and declined slightly beyond 20. It then remained consistent until the embedding size reached 120, after which it diminished with further increases, as the large number of weights (parameters) caused the model to overfit.

    Figure 5:Dependency between embedding size and macro-avg.F-score

    Number of hidden layers and number of neurons in each layer: It is important to tune these two hyperparameters concurrently to determine the best combination. Initially, a single hidden layer was used, with the number of neurons drawn from the range (64, 128, 256, 512, 1024). A second layer was then added, with the same range applied to both layers, and the process was repeated for a third layer. We concluded that the best macro-average F-score is achieved with two hidden layers, each with 256 neurons.
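    The concurrent sweep over depth and width amounts to enumerating layer-size tuples, which can be sketched with `itertools.product` (neuron options as listed in the text, taking the largest option as 1024):

```python
import itertools

neuron_options = (64, 128, 256, 512, 1024)

# Enumerate every architecture with one, two, or three hidden layers,
# each layer independently drawing its width from neuron_options.
candidates = []
for depth in (1, 2, 3):
    candidates += list(itertools.product(neuron_options, repeat=depth))

# The selected configuration, two hidden layers of 256 neurons,
# appears among the two-layer candidates as the tuple (256, 256).
```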

    Activation function: Fig. 6 shows the dependency between the type of activation function and the macro-average F-score. These functions are applied at each neuron so that the model can learn nonlinear decision boundaries in the data. Three activation functions are commonly used in ANNs:

    ReLU(x) = max(0, x)
    tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x))
    sigmoid(x) = 1 / (1 + e^(−x))

    The best performance was achieved with the ReLU activation function, which is also computationally the most efficient, as it requires only a threshold operation on the input value.

    Figure 6:Dependency between activation function type and macro-avg.F-score

    Learning rate: The learning rate was varied over the range 0.0001–0.1024, with each candidate value obtained by multiplying the previous one by 2. The best macro-average F-score was achieved with learning rates between 0.0008 and 0.0016. This interval was examined more finely, and the best performance was observed at a learning rate of 0.001.
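    The two-stage sweep can be written out directly; the step size of the finer pass is an illustrative assumption, while the coarse doubling grid follows the text:

```python
# Coarse sweep: doubling from 0.0001 up to 0.1024 (11 candidates)
coarse = [0.0001 * 2 ** k for k in range(11)]

# Finer sweep inside the best coarse interval [0.0008, 0.0016]
# (a 0.0001 step is assumed here for illustration)
fine = [0.0008 + i * 0.0001 for i in range(9)]
```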

    Tab.3 summarizes the tuned hyperparameters used in this work.

    4 Experiments

    Subsection 4.1 presents the experimental settings used in this study. Subsection 4.2 shows the effectiveness of the proposed feature representation using entity embedding. Subsection 4.3 compares the proposed model to state-of-the-art solutions.

    4.1 Experimental Settings

    To ensure fair comparisons, each comparative method used the settings of the work in which it was proposed. The holdout technique was used to split the dataset into 60% training and 40% testing, with stratified sampling to ensure a consistent ratio of classes in both sets. Fig. 7 shows the data distribution over normal traffic and the different types of attacks for the training and test sets.
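    A stratified 60/40 holdout can be implemented directly by splitting each class's indices separately. The class counts below are hypothetical, not the WSN-DS distribution.

```python
import random
from collections import defaultdict

def stratified_holdout(labels, train_frac=0.6, seed=0):
    """Split indices into train/test while preserving class proportions."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, lab in enumerate(labels):
        by_class[lab].append(idx)
    train, test = [], []
    for lab, idxs in by_class.items():
        rng.shuffle(idxs)
        cut = round(len(idxs) * train_frac)  # 60% of each class to training
        train += idxs[:cut]
        test += idxs[cut:]
    return train, test

# Hypothetical imbalanced class counts
labels = ["normal"] * 900 + ["blackhole"] * 60 + ["flooding"] * 40
train, test = stratified_holdout(labels)
```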

    Table 3:Summary of hyperparameters used in this work

    Figure 7:Data distribution over normal and different types of attacks for training and test sets

    4.2 Effectiveness of Proposed Feature Representation Method

    To demonstrate the power of the proposed feature representation method, its performance was compared using two approaches.

    In the first approach, the feature representation method was varied while the network architecture was kept unchanged. Tab. 4 compares the performance of the proposed feature representation, ordinal encoding (the original feature values), and binary encoding. The proposed feature representation outperforms the other types in macro F-score. However, binary encoding delivers performance comparable to that of the proposed method and outperforms the state-of-the-art methods shown in Subsection 4.3. We believe binary encoding performs well because it handles categorical features with high cardinality (# of levels).

    In the second approach, the proposed method was evaluated using varying network architectures to demonstrate its effectiveness.

    Table 4:Micro- and macro-avg.F-score of proposed WSN intrusion detection approaches using WSN-DS dataset

    Tab. 5 presents the performance of the proposed feature representation across several network architectures. The performance changes little as layers are added or the number of neurons in each layer is increased; this stability reflects the effectiveness of the method. While the 256–256 and 512–256 configurations achieved similar performance, we selected 256–256, as it is more computationally efficient.

    Table 5:Micro- and macro-avg.F-score of proposed feature representation with several network architectures

    4.3 Comparison with State-of-the-Art

    We show how the proposed model can outperform the state-of-the-art in terms of micro and macro F-score. The proposed method significantly outperforms all compared methods in macro F-score and slightly outperforms them in micro F-score. The small difference in the micro-average is due to the domination of its result by the majority class (normal); Fig. 7 shows the extreme difference in the number of instances between the normal class and the abnormal classes.

    Table 6: Micro- and macro-avg. F-score of the proposed and state-of-the-art WSN intrusion detection approaches on the WSN-DS dataset

    It is evident from Tab. 6 that the methods of [5] and [25] achieved the highest accuracy among the compared solutions. Their detailed results and those of the proposed method are given in Tabs. 7, 8, and 9, respectively. The proposed method delivers better performance in terms of recall, precision, and FPR, as shown in Tabs. 7–9.

    Table 7:Recall,FPR,and precision of work in[5]

    Table 8:Recall,FPR,and precision of work in[25]

    Table 9:Recall,FPR,and precision of proposed model

    5 Conclusion and Future Work

    We proposed a deep learning architecture that uses ANNs with categorical entity embedding, and demonstrated that it can produce an effective intrusion detection system for WSNs. Entity embedding was shown to create a robust representation of raw features that leads to better performance than the state-of-the-art. In future work, the entity embedding representation will be combined with ensemble classifiers instead of a classical ANN to take advantage of the classification power of ensemble methods such as random forest and gradient boosted trees. This work was limited to a single dataset because of a lack of publicly available datasets collected for WSN DoS attacks; in future work, we intend to obtain additional datasets to detect different DoS attacks.

    Acknowledgement:This publication was supported by the Deanship of Scientific Research at Prince Sattam bin Abdulaziz University.

    Funding Statement:The author received no specific funding for this study.

    Conflicts of Interest:The author declares that he has no conflicts of interest to report regarding the present study.
