
    A Support Data-Based Core-Set Selection Method for Signal Recognition

China Communications, 2024, Issue 4

Yang Ying, Zhu Lidong, Cao Changjie

1 National Key Laboratory of Wireless Communications (University of Electronic Science and Technology of China), Chengdu 611731, China

2 College of Computer Science and Cyber Security (Chengdu University of Technology), Chengdu 610059, China

3 College of Mathematics and Physics (Chengdu University of Technology), Chengdu 610059, China

Abstract: In recent years, deep learning-based signal recognition technology has gained attention and emerged as an important approach for safeguarding the electromagnetic environment. However, training deep learning-based classifiers on large signal datasets with redundant samples requires significant memory and high costs. This paper proposes a support data-based core-set selection method (SD) for signal recognition, aiming to screen a representative subset that approximates the large signal dataset. Specifically, this subset can be identified by employing the labeled information during the early stages of model training, as some training samples are frequently labeled as support data. This support data is crucial for model training and can be found using a border sample selector. Simulation results demonstrate that the SD method minimizes the impact on model recognition performance while reducing the dataset size, and outperforms five other state-of-the-art core-set selection methods when the fraction of training samples kept is less than or equal to 0.3 on the RML2016.04C dataset or 0.5 on the RML22 dataset. The SD method is particularly helpful for signal recognition tasks with limited memory and computing resources.

Keywords: core-set selection; deep learning; model training; signal recognition; support data

I. INTRODUCTION

The rapid growth of satellite, wireless communications, and IoT technology has resulted in a dramatic increase in the number and diversity of wireless signals accessing the electromagnetic spectrum, seriously threatening the security of electromagnetic environments. Therefore, it is important to effectively identify or monitor these wireless signals. In recent years, deep learning has gained considerable attention and has been successfully employed for various signal recognition tasks like spectrum monitoring [1, 2], interference identification [3, 4] and outlier detection [5, 6], establishing itself as a crucial tool in maintaining the security of the electromagnetic environment.

Most deep neural networks (DNNs) are trained on large datasets to reach advanced levels, which is extremely demanding in terms of memory and computation [7-12]. Large datasets contain redundant samples, and pre-screening valuable ones can effectively reduce the training cost of recognition models. However, determining which samples are worth retaining and which ones should be removed is a challenging task for signal datasets.

The core-set selection technique has been widely used to identify a representative subset from a large dataset, such that models trained on these subsets exhibit generalization performance similar to models trained on the original datasets. Moreover, the work of [13-15] clearly pointed out that the training samples close to the decision boundary are very important for speeding up model convergence. The representative subset is a data subset that includes these important samples as its main body. However, the implicit nature of the decision boundary in DNNs makes it difficult to directly find the samples located near the decision boundary. Therefore, many classical core-set selection methods [16, 17] screen the representative subset by calculating various indirect parameters such as geometric properties or distances from class centers, but most of them require the use of a well-trained model.

In contrast, this paper proposes a novel support data-based core-set selection method (SD) for signal recognition, which aims to screen a representative subset that can effectively replace the large signal dataset. The representative subset consists of those training samples that are frequently identified as support data during the early stages of model training. To this end, we employ a border sample selector to accurately label this support data. Experiments were conducted on the public RML2016.04C and RML22 signal datasets, and we replicated five other state-of-the-art core-set selection methods to adapt them for signal recognition tasks. The simulation results demonstrate that our SD method has a strong competitive advantage: when the fraction of data kept is less than 0.3 on the RML2016.04C dataset or 0.5 on the RML22 dataset, it outperforms the five other advanced core-set methods. This work can potentially benefit numerous applications with limited computing capabilities and memory.

    The contributions of this paper are summarized as follows:

• We constructed a border sample selector that learns from the output vector set of the signal classifier and accurately identifies support data located near decision boundaries.

• We designed a strategy to evaluate the importance of training samples by counting the frequency with which each sample is labeled as support data during the early stages of model training, in order to identify important samples for constructing a representative subset.

• We reproduced five other state-of-the-art core-set selection methods to adapt them for signal recognition tasks.

• Experimental results confirm that our SD method exhibits excellent performance. When the fraction of data kept is less than 0.3 on the RML2016.04C dataset or 0.5 on the RML22 dataset, it outperforms the five other core-set selection methods.

The rest of this paper is organized as follows. Section II introduces related work on core-set selection methods. Section III studies the principle of our SD method for the signal recognition task. In Section IV, a series of experimental results is employed to show the effectiveness of the SD method. Finally, conclusions are drawn in Section V.

II. RELATED WORK

Many core-set methods have demonstrated the presence of redundant samples within large datasets, indicating the potential for further reduction in their size. Employing a smaller dataset can effectively mitigate the cost of model training. Prior research in this area can be broadly categorized into two directions: synthetic sample generation and raw subset selection.

The former distills the knowledge of the entire training set into a few synthetic samples generated by generative models such as VAEs and GANs, which can serve as substitutes for the original training set when training recognition models. Representative research is dataset distillation [18-20]. However, the training of generative models is difficult and time-consuming.

The latter employs various selection strategies to screen a valuable subset of the raw training set, thereby reducing its size. Representative research includes active learning [15, 21, 22] and core-set selection [16, 17, 23-25]. Some studies [13-15] have demonstrated the effectiveness of utilizing border samples in proximity to decision boundaries to accelerate model convergence, but finding border samples directly remains challenging. Various core-set methods employ different indirect parameters to label border samples, including geometric properties [16], distance from the class center [17] and gradient/perturbation changes [15, 25]. For example, Max Welling [17] selected border samples based on their distance from the class center in the feature space: the larger the distance, the further the sample is from its class center and the closer it is to the decision boundary. Ducoffe et al. [15] labeled border samples by generating adversarial samples, that is, by converting the distance of each sample to the decision boundary into an amount of perturbation; if the perturbation needed to change a sample's label is small, the sample is close to the decision boundary. All the aforementioned methods can effectively screen a representative subset using a well-trained model, but they may encounter limitations when only an early-stage trained model is available.

At present, there are few core-set selection methods based on early model-training information. Paul et al. [24] defined a sample's contribution to model training as the reduction in loss of all other samples over a single gradient step, and dynamically identified important samples during the early stages of model training. Furthermore, Toneva et al. [23] proposed that the model tends to forget samples near the decision boundary during training; the authors therefore tracked the number of times a sample transitions from correct to incorrect classification and selected important samples accordingly. Inspired by this, this paper proposes a novel support data-based core-set selection method for signal recognition. It evaluates the importance of each sample based on the information provided by a constructed border sample selector, which learns from the output vectors of the signal classifier during the early stages of model training. Samples are then selected to construct the representative subset based on this importance. The detailed description is presented in Section III.

III. A NOVEL CORE-SET SELECTION METHOD

This section presents the problem description of the core-set selection method for signal recognition tasks, followed by an elaboration on the model framework and the principle of our SD method. It concludes with a detailed exposition of implementation specifics.

    3.1 Problem Description

The development of communication and IoT technology has resulted in a significant increase in the number of signals that need to be identified in the electromagnetic environment. Therefore, pre-screening a representative subset for model training is important, as it can effectively reduce the size of large datasets while minimizing the impact on the model's recognition performance.

Similar to [26], this work considers a large training set of raw signals $D_{train}$ including $K$ signal classes and $N$ signal samples, where $y$ is a signal sample's corresponding label. Core-set selection technology aims to formulate a selection strategy to find the most valuable subset $d_{train}$ from the large signal dataset $D_{train}$, where $M$ is the size of $d_{train}$ and $M \ll N$, such that the outputs of the model trained on $d_{train}$ are as close as feasible to the results of the model trained on $D_{train}$. The raw signal $x(t)$ can be formalized as [27, 28]:

$$x(t) = s(t) * c(t) + n(t) \tag{1}$$

where $s(t)$ is the transmitted baseband signal, $c(t)$ is the impulse response of the transmission channel, $n(t)$ is the Additive White Gaussian Noise (AWGN), $*$ denotes convolution, and $x(t)$ represents the received baseband wireless signal.
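For illustration, a minimal NumPy sketch of this signal model is given below; the QPSK source, the toy channel taps, and the SNR value are demonstration assumptions, not parameters specified by the paper.

```python
import numpy as np

# Minimal sketch of the signal model x(t) = s(t) * c(t) + n(t).
rng = np.random.default_rng(0)

N = 128                                    # samples per frame (1x2x128 format)
s = np.exp(1j * (np.pi / 4) * (2 * rng.integers(0, 4, N) + 1))  # toy QPSK s(t)
c = np.array([1.0, 0.3 + 0.2j, 0.1j])      # assumed channel impulse response c(t)

snr_db = 10.0
x_clean = np.convolve(s, c, mode="same")   # s(t) * c(t): channel convolution
noise_power = np.mean(np.abs(x_clean) ** 2) / 10 ** (snr_db / 10)
n = np.sqrt(noise_power / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
x = x_clean + n                            # received signal x(t) with AWGN n(t)

iq = np.stack([x.real, x.imag])            # 2x128 I/Q representation
print(iq.shape)                            # (2, 128)
```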

    3.2 Support Data-based Core-Set Selection

In the early stages of signal classifier training, this paper proposes a novel support data-based core-set selection (SD) method to identify a representative subset $d_{train}$ from a large signal dataset $D_{train}$. It involves two steps: labeling and screening. The principles behind these steps are explained in detail as follows.

    3.2.1 Labeling Important Samples

The decision boundary of a DNN-based signal classifier is implicit, posing challenges in directly identifying important samples near decision boundaries. To address this issue, we construct a border sample selector trained on the feature vector set of the signal classifier to approximate decision boundaries and screen important samples. As shown in Figure 1, the model framework used consists of two key components: a signal classifier and a border sample selector, which are described in detail below.

Figure 1. The schematic diagram of the model framework used. It consists of a signal classifier and a border sample selector.

The DNN-based signal classifier aims to extract the distinctive features of signals and generate accurate classification predictions. Its model architecture follows the public signal recognition model VT-CNN2 in [29], which consists of two convolutional layers containing 256 and 80 convolution kernels, respectively, each followed by a ReLU activation, and two fully connected layers. The two fully connected layers contain 256 and $K$ neurons, respectively, where $K$ is the number of signal categories, and use ReLU and Softmax as their respective activation functions. The Softmax activation function converts feature vectors into discrete probabilities. Specifically, the feature vector set of the training samples after one SGD training pass of the signal classifier can be written as $Z = \{z_i = e_i w\}_{i=1}^{N}$, where $N$ is the number of training samples, $e_i = \varphi(x_i)$ with $e_i \in \mathbb{R}^{1\times T}$ is the $i$-th input of the classifier's last fully connected layer, $w \in \mathbb{R}^{T\times K}$ is the weight of that layer, and $K$ is the classifier's output dimension (the number of signal classes). The $i$-th output vector is converted by the Softmax function as follows:

$$p_{i,k} = \frac{\exp(z_{i,k})}{\sum_{j=1}^{K}\exp(z_{i,j})} \tag{2}$$

where $p_{i,k}$ is the probability that signal sample $x_i$ is predicted to be class $k$, $z_{i,k} = e_i w_k$ is the output of the $k$-th node of the last fully connected layer, and $w_k$ is the weight of the $k$-th node of the last fully connected layer.
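To make the classifier and feature-vector notation concrete, the following PyTorch sketch mirrors a VT-CNN2-style architecture as described above; the kernel sizes, padding, and class count are assumptions, since the exact configuration of [29] is not reproduced here.

```python
import torch
import torch.nn as nn

class VTCNN2Like(nn.Module):
    """Sketch of a VT-CNN2-style classifier per the description above:
    two conv layers (256 and 80 kernels) + two FC layers (256 and K).
    Kernel sizes and padding are illustrative assumptions, not from [29]."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 256, kernel_size=(1, 3), padding=(0, 1)), nn.ReLU(),
            nn.Conv2d(256, 80, kernel_size=(2, 3), padding=(0, 1)), nn.ReLU(),
            nn.Flatten(),
        )
        self.fc1 = nn.Sequential(nn.Linear(80 * 1 * 128, 256), nn.ReLU())
        self.fc2 = nn.Linear(256, num_classes)   # last FC layer: z_i = e_i w

    def forward(self, x):
        e = self.fc1(self.features(x))    # e_i: input to the last FC layer
        z = self.fc2(e)                   # z_i: feature vector fed to Softmax
        return z, e

model = VTCNN2Like(num_classes=11)
x = torch.randn(4, 1, 2, 128)             # batch of 1x2x128 I/Q frames
z, e = model(x)
p = torch.softmax(z, dim=1)               # p_{i,k} as in Eq. (2)
print(z.shape, p.shape)                   # torch.Size([4, 11]) torch.Size([4, 11])
```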

The cross-entropy loss is used as the signal classifier's optimization function, and it can be formalized as follows:

$$L(\theta) = -\frac{1}{N}\sum_{i=1}^{N}\sum_{k=1}^{K} y_{i,k}\log p_{i,k} \tag{3}$$

where $\theta$ denotes the parameters of the signal classifier, $N$ and $K$ are the number of samples and categories in the signal training set, respectively, and $y_{i,k}$ is an indicator function whose value is 1 when $y_i = k$ and 0 otherwise.

To screen important samples near decision boundaries, a border sample selector with $K$ support vector machines (SVMs) is constructed to learn from the feature vector set $Z$ of the signal classifier, as shown in Figure 2 and Figure 3. Since the feature vector set is linearly separable (see Figure 2(a)), this paper uses a simple linear SVM's hyperplane equation to approximate the decision boundary between two classes of the signal classifier (see Figure 2(b)-(d)), drawing inspiration from the work of [30, 31]. Following the one-vs-rest voting scheme [32, 33], $K$ binary SVMs are used to construct multi-class signal boundaries and to label samples located near the decision boundary (see Figure 2(e)).

Figure 2. The schematic diagram of the inputs and outputs of the border sample selector. (a) input of the border sample selector, (b)-(d) results of three linear SVMs, (e) result of the border sample selector.

Figure 3. The flow chart of the support data-based core-set selection method for signal recognition.

The hyperplane equation of a linear SVM in the border sample selector can be written as follows:

$$a^{T}z + b = 0 \tag{4}$$

where $a$ is the normal vector, which determines the direction of the hyperplane, and $b$ is the displacement, which determines the distance between the hyperplane and a training sample $z_i$. The hyperplane is jointly determined by the normal vector $a$ and the displacement $b$.

On this basis, the $T \ll N$ support vectors in close proximity to the hyperplane can be found directly once the parameters $a$ and $b$ are solved. Each support vector $(z_i, y_i)$ is a point that lies on the boundary of the maximum margin and meets the following criterion:

$$y_i\left(a^{T}z_i + b\right) = 1 \tag{5}$$

The margin is defined as the projection onto $a$ of the difference between support vectors of the two classes, which equals $2/\|a\|$, and the hyperplane is located in the middle of the margin.

However, in order to find the hyperplane with the largest margin, the parameters $a$ and $b$ need to satisfy the following constrained problem:

$$\max_{a,b}\ \frac{2}{\|a\|}\quad \text{s.t.}\quad y_i\left(a^{T}z_i + b\right)\ge 1,\ i=1,\dots,N \tag{6}$$

Clearly, the training objective is to maximize $\|a\|^{-1}$, which is equivalent to minimizing $\|a\|^{2}$. Thus equation (6) can be rewritten as:

$$\min_{a,b}\ \frac{1}{2}\|a\|^{2}\quad \text{s.t.}\quad y_i\left(a^{T}z_i + b\right)\ge 1,\ i=1,\dots,N \tag{7}$$

We set a slack variable $\xi_i$ and a penalty parameter $C$ for each training sample; the optimization problem of the border sample selector is then to minimize $L_s$:

$$L_s = \frac{1}{2}\|a\|^{2} + C\sum_{i=1}^{N}\xi_i\quad \text{s.t.}\quad y_i\left(a^{T}z_i + b\right)\ge 1-\xi_i,\ \xi_i\ge 0 \tag{8}$$

After obtaining the optimal solutions for $a$ and $b$, we approximately label the feature vectors located near the decision boundary using the support-vector coordinates provided by equation (5). The important samples are the raw signal samples corresponding to these feature vectors.
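A minimal sketch of this labeling step, assuming the feature vectors and labels have been collected into NumPy arrays, might look as follows; it uses scikit-learn's SVC with a linear kernel, as in the implementation details of Section 3.2.3, and treats any sample chosen as a support vector by at least one one-vs-rest SVM as support data.

```python
import numpy as np
from sklearn.svm import SVC

def label_support_data(Z: np.ndarray, y: np.ndarray, num_classes: int,
                       C: float = 0.9) -> np.ndarray:
    """Sketch of the border sample selector: K one-vs-rest linear SVMs are
    fit on the classifier's feature vectors Z (shape (N, K)); a sample is
    labeled as support data if any binary SVM selects it as a support vector."""
    is_support = np.zeros(len(Z), dtype=bool)
    for k in range(num_classes):
        svm = SVC(kernel="linear", C=C)       # linear SVM with C = 0.9
        svm.fit(Z, (y == k).astype(int))      # one-vs-rest binary problem
        is_support[svm.support_] = True       # indices of its support vectors
    return is_support
```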

    3.2.2 Screening the Representative Subset

As signal classifier training progresses, we have observed that some signal samples tend to appear in close proximity to decision boundaries and are frequently labeled as support vectors by our border sample selector. A similar phenomenon is described in [23], whose authors defined the concept of forgetting events for each training sample and showed that training samples with a high frequency of forgetting events are important for effective classifier training. Inspired by this, this paper proposes a novel selection strategy for screening a representative subset; the implementation steps are as follows.

As shown in Figure 3, we first record the frequency with which each training sample is labeled as a support vector during the 20 pre-training epochs of the signal classifier, which serves as the basis for evaluating the importance of each training sample in signal classifier training. Then, we sort all training samples in descending order of their labeled frequency, with higher frequency values indicating greater importance for the signal classifier. Finally, the representative subset consists of the first $M$ training samples, where $M = N \times f$ and $f \in [0,1]$ is the kept fraction. Our SD method is elaborated further in Algorithm 1.
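A sketch of this screening step is shown below; the commented driver loop is an assumption about how the pieces fit together, and the helper names train_one_epoch and collect_feature_vectors are hypothetical.

```python
import numpy as np

def select_core_set(label_counts: np.ndarray, fraction: float) -> np.ndarray:
    """Sketch of the screening step: keep the M = N * f samples that were
    labeled as support data most often during the pre-training epochs."""
    n = len(label_counts)
    m = int(n * fraction)                        # M = N * f
    order = np.argsort(label_counts)[::-1]       # descending labeled frequency
    return order[:m]                             # indices of the core-set

# Illustrative driver tying both steps together (assumed workflow):
# label_counts = np.zeros(N, dtype=int)
# for epoch in range(20):                        # 20 pre-training epochs
#     train_one_epoch(model)                     # one SGD pass of the classifier
#     Z, y = collect_feature_vectors(model)      # z_i = e_i w for all samples
#     label_counts += label_support_data(Z, y, num_classes=11)
# core_idx = select_core_set(label_counts, fraction=0.3)
```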

We contend that the core idea of computing forgetting-event statistics and the core idea of computing labeled frequencies are consistent, as both rely on early model-training information to find important training samples near decision boundaries. The distinction is that our SD method directly labels these samples through a border sample selector.

    3.2.3 Implementation Details

All experiments were conducted on two publicly available signal datasets, RML2016.04C [34, 35] and RML22 [36]. The RML2016.04C dataset consists of three analog modulations and eight digital modulations, with each modulation corresponding to 20 SNR values ranging from -20 dB to +18 dB in steps of 2 dB. The data format of each signal sample is 1×2×128. The RML22 dataset contains two analog modulations (excluding AM-SSB) and eight digital modulations, with other parameters similar to the RML2016.04C dataset. More detailed descriptions are provided in [34, 36]. To mitigate the influence of noise on simulation results, we used signal samples within the SNR range of [0 dB, +18 dB] to construct the experimental dataset. As a result, the RML2016.04C dataset used in this study comprises 51,040 signals, while the RML22 dataset includes 126,000 signals. Each modulation within each dataset is divided into a training set, validation set and test set in a ratio of 7:2:1 for the subsequent experiments.
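A sketch of this dataset preparation, under the assumption that the raw arrays carry per-sample SNR values, could look as follows; the array names are assumptions.

```python
import numpy as np

def split_per_modulation(samples: np.ndarray, labels: np.ndarray,
                         snr: np.ndarray, seed: int = 0):
    """Sketch: keep samples with SNR >= 0 dB, then split each modulation
    7:2:1 into train/val/test sets."""
    keep = snr >= 0                               # [0 dB, +18 dB] filter
    samples, labels = samples[keep], labels[keep]
    rng = np.random.default_rng(seed)
    train_idx, val_idx, test_idx = [], [], []
    for k in np.unique(labels):                   # split each class separately
        idx = rng.permutation(np.flatnonzero(labels == k))
        n_tr, n_va = int(0.7 * len(idx)), int(0.2 * len(idx))
        train_idx += list(idx[:n_tr])
        val_idx += list(idx[n_tr:n_tr + n_va])
        test_idx += list(idx[n_tr + n_va:])
    tr, va, te = (np.array(i) for i in (train_idx, val_idx, test_idx))
    return (samples[tr], labels[tr]), (samples[va], labels[va]), (samples[te], labels[te])
```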

As previously mentioned in Figure 1, our SD method uses a model framework consisting of a signal classifier and a border sample selector. For the signal classifier, we used the SGD optimizer with learning rates of 0.001 and 0.01 (momentum = 0.9, weight decay = 5e-4) to perform the 20 pre-training epochs and the 350 optimization-training epochs on both the RML2016.04C and RML22 datasets, with the batch size set to 64. The border sample selector consists of $K$ linear SVMs constructed using the SVC function from the scikit-learn library, with the penalty parameter $C$ set to 0.9. All experimental results are averaged over five runs.
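The configuration above could be expressed as in the following sketch; mapping the 0.001 learning rate to pre-training and 0.01 to the main training phase is an assumption, as the paper does not state the pairing explicitly.

```python
import torch

# model: the VT-CNN2-style classifier sketched in Section 3.2.1
model = VTCNN2Like(num_classes=11)

# Pre-training phase (20 epochs, used for core-set selection)
pretrain_opt = torch.optim.SGD(model.parameters(), lr=0.001,
                               momentum=0.9, weight_decay=5e-4)

# Main training phase (350 epochs on the selected subset)
train_opt = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=5e-4)

criterion = torch.nn.CrossEntropyLoss()    # cross-entropy loss of Eq. (3)
batch_size = 64
```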

IV. EXPERIMENTS AND RESULTS

This paper proposes a novel support data-based core-set selection method for signal recognition, aiming to screen a representative subset from a large dataset in the early stages of model training. The simulation experiments conducted in this section therefore need to demonstrate the following: that the SD method can accurately identify training samples near the decision boundary, that the selected subset is effective, and that the method exhibits superior performance compared to five other advanced core-set selection methods.

    4.1 Effectiveness of the Border Sample Selector

The implicit nature of decision boundaries in DNNs makes it challenging to screen training samples located near the decision boundary. To address this issue, this paper trains a border sample selector on the feature vector set of the signal classifier, which allows important samples to be labeled directly. Consequently, the first experiment demonstrates the effectiveness of the border sample selector.

The multi-class border sample selector is extended from multiple binary selectors based on the one-vs-rest voting scheme [32, 33]. For convenience, we conducted a simulation experiment on a training set consisting of the first two modulations (8PSK and AM-DSB) within the RML2016.04C dataset and present the labeling result of the border sample selector in Figure 4.

Figure 4. The visualization outcome of the important sample selection. Left: visualization of the output vectors from the signal classifier. Right: visualization of the labeling result of the border sample selector.

In the signal's feature space, it is apparent that the border sample selector approximates the decision boundary of the signal classifier as a linear function and effectively identifies the feature vectors (yellow data points) close to this boundary. These yellow data points correspond to raw signal samples labeled as important samples, providing evidence that our SD method's border sample selector accurately identifies such signal samples.

    4.2 The Representative Subset Selection

During the first 20 epochs of model pre-training, we observed that some signal samples frequently appear in close proximity to the decision boundary, leading our border sample selector to label them as support vectors multiple times. In the second experiment, we fully utilized this information to identify training samples that are conducive to model training.

On the RML2016.04C dataset, we counted the number of times each training sample was labeled by the border sample selector during the first 20 pre-training epochs, as shown in Figure 5. It can be seen that the number of labeled times differs across training samples during the early stage of model training, indicating that the frequency with which each training sample appears near decision boundaries differs. We believe that this frequency information is closely related to the contribution each training sample makes to model learning. In particular, some training samples (the dark blue and blue data points) were labeled as border samples many times, and we posit that they hold greater significance for model training than the others.

Figure 5. The labeled times of the first 50 training samples during the 20 pre-training epochs on the RML2016.04C dataset.

Additionally, we investigated the impact of varying the number of pre-training epochs on the recognition accuracy of the signal classifier for the RML2016.04C dataset. With only 30% of the training samples retained, Figure 6 illustrates that increasing the number of pre-training epochs leads to higher recognition accuracy for signal classifiers trained on the selected representative subsets, indicating that the subsets become more valuable. This is in line with our expectations. In this paper, the number of pre-training epochs of the signal classifier is set to 20.

Figure 6. The influence of different pre-training epochs on the test accuracy of the signal classifier.

4.3 Comparison with Other Core-Set Methods

To demonstrate the advantages of our SD method, we reproduced five other state-of-the-art core-set methods and adapted them for signal recognition tasks, as shown in Table 1. We then compared the performance of our SD method with these advanced core-set methods on two publicly available datasets, RML2016.04C and RML22. The ‘No pruning’ method uses the entire training set to train the signal classifier, and it is anticipated that this method will yield the highest recognition accuracy. The ‘Herding’ [17], ‘DeepFool’ [15] and ‘Cal’ [37] methods require a well-trained classifier for core-set selection, while ‘Craig’ [25] and ‘Forgetting’ [23] only require an early-stage trained classifier to perform core-set selection.

Table 1. Comparison analysis of the basic principles of different core-set selection methods.

In this paper, the VT-CNN2 model [29] is used as the signal classifier, and the simulation experiments have the following settings. Specifically, for ‘Craig’ [25], ‘Forgetting’ [23], and our SD method, we recorded the gradient change, the number of forgetting events, and the number of labeled times for each training sample during the 20 pre-training epochs of the signal classifier; these metrics were used to evaluate the contribution of signal samples to classifier training. This study focuses on selecting a representative subset from signal datasets during the early stage of model training to reduce dataset size and expedite subsequent training. Therefore, to control variables, we employed the signal classifier obtained after 20 pre-training epochs in place of a well-trained classifier when executing the core-set selection methods ‘Herding’ [17], ‘DeepFool’ [15], and ‘Cal’ [37].

In Figure 7, we compare the recognition performance of the signal classifier after training on subsets of different sizes selected by the various core-set selection methods. The ‘No pruning’ method exhibits superior recognition accuracy as anticipated, owing to its use of the complete signal training set and the comprehensive information it provides. Its performance can be regarded as the upper limit of what signal classifiers can learn from the current signal datasets. All other core-set selection methods show varying degrees of performance degradation as the fraction of data kept changes. In particular, the recognition performance of the ‘Herding’ [17], ‘DeepFool’ [15], and ‘Cal’ [37] methods is unstable due to their limited ability to leverage early-stage model-training information for dataset screening. The recognition performance of the two dynamic screening methods, ‘Craig’ [25] and ‘Forgetting’ [23], improves gradually as the fraction of data kept increases; moreover, when the fraction exceeds 0.7 on the RML2016.04C dataset or 0.5 on the RML22 dataset, their performance closely approximates that of the ‘No pruning’ method. In contrast, our SD method outperforms the other core-set selection methods on both signal datasets, and it closely approaches the performance of ‘No pruning’ even when the fraction is less than 0.3 on the RML2016.04C dataset or less than 0.5 on the RML22 dataset. This experiment clearly demonstrates the effectiveness of the proposed SD method.

Figure 7. Performance advantages of the proposed SD method on different signal datasets.

In addition, we demonstrate the performance advantages of the SD method by comparing the minimum number of training samples required to meet given recognition accuracy requirements across the different core-set selection methods, as shown in Figure 8. The results in Figure 8(a)-(b) show that, on the RML2016.04C dataset, the SD method uses the fewest training samples to reach recognition accuracies of 85% and 90%, indicating its superior ability to retain relevant information while eliminating redundant or irrelevant samples. It uses 22.7% (30.5%-7.8%) and 5% (35.5%-30.5%) fewer training samples, respectively, than the suboptimal ‘Forgetting’ method [23]. Similarly, according to Figure 8(c), the SD method requires the fewest training samples to reach 80% recognition accuracy on the RML22 dataset; remarkably, it uses 23.9% (47.9%-24%) fewer training samples than the suboptimal ‘Forgetting’ method [23]. In summary, the SD method is an excellent core-set selection method, especially when the fraction of training samples kept is low, enabling the signal classifier to quickly achieve good performance.

Figure 8. Comparison of training sample sizes required by different core-set selection methods.

V. CONCLUSION

In this paper, a novel support data-based core-set selection method is proposed for signal recognition, which can reduce the signal dataset size while minimizing the impact on the signal classifier's recognition performance. The method is based on the premise that the data points important for model training are situated in close proximity to the decision boundary. Specifically, we first trained a border sample selector on the feature vector set of the signal classifier to directly label important samples. Second, we observed that some training samples frequently appear in the support vector set during classifier training; based on this, we proposed a novel selection strategy to evaluate the importance of each training sample for model training. Finally, we used the $M$ most frequently labeled samples to construct a representative subset approximating the entire signal dataset. Experimental results on the public RML2016.04C and RML22 datasets demonstrate that our SD method outperforms five other state-of-the-art core-set methods by effectively balancing the trade-off between reducing the size of the training set and maintaining high recognition accuracy of the classifier model. Specifically, to meet the 85% recognition accuracy requirement on the RML2016.04C dataset, our SD method requires 22.7% fewer training samples than the other advanced core-set selection methods; to meet the 80% recognition accuracy requirement on the RML22 dataset, the number of training samples used is reduced by 23.9%. In addition, our SD method exhibits superior performance on the RML2016.04C dataset when retaining less than 0.3 of the data and on the RML22 dataset when retaining less than 0.5. It is suitable for signal recognition tasks in edge scenarios that prioritize limited memory and computation over the highest recognition accuracy. Future research could explore the method's specific applications in edge scenarios or its potential in memory-limited tasks such as continual learning.

    ACKNOWLEDGEMENT

This work is fully supported by the National Natural Science Foundation of China (62371098), the Natural Science Foundation of Sichuan Province (2023NSFSC1422), the National Key Research and Development Program of China (2021YFB2900404), and the Fundamental Research Funds for the Central Universities of Southwest Minzu University (ZYN2022032).
