
    A PID-incorporated Latent Factorization of Tensors Approach to Dynamically Weighted Directed Network Analysis

    2022-01-26
    IEEE/CAA Journal of Automatica Sinica, 2022, Issue 3

    Hao Wu, Xin Luo, MengChu Zhou, Muhyaddin J. Rawa, Khaled Sedraoui, and Aiiad Albeshri

    Abstract—A large-scale dynamically weighted directed network (DWDN) involving numerous entities and massive dynamic interactions is an essential data source in many big-data-related applications, such as a terminal interaction pattern analysis system (TIPAS). It can be represented by a high-dimensional and incomplete (HDI) tensor whose entries are mostly unknown. Yet such an HDI tensor contains a wealth of knowledge regarding various desired patterns, like potential links in a DWDN. A latent factorization-of-tensors (LFT) model proves to be highly efficient in extracting such knowledge from an HDI tensor, which is commonly achieved via a stochastic gradient descent (SGD) solver. However, an SGD-based LFT model suffers from slow convergence that impairs its efficiency on large-scale DWDNs. To address this issue, this work proposes a proportional-integral-derivative (PID)-incorporated LFT model. It constructs an adjusted instance error based on the PID control principle, and then substitutes it into an SGD solver to improve the convergence rate. Empirical studies on two DWDNs generated by a real TIPAS show that, compared with state-of-the-art models, the proposed model achieves a significant efficiency gain as well as highly competitive prediction accuracy when handling the task of missing link prediction for a given DWDN.

    I. INTRODUCTION

    IN this era of big data, a large-scale dynamically weighted directed network (DWDN) representing dynamic interactions among massive entities is frequently encountered in various big data analysis projects [1]–[5]. Its examples include a terminal interaction network [6], [7], a social network [8], [9], and a wireless sensor network [10], [11]. However, owing to the huge number of involved entities, it becomes impossible to observe their full interactions, making a corresponding DWDN high-dimensional and incomplete (HDI). For instance, the terminal interaction pattern analysis system (TIPAS) concerned in this study focuses on a large-scale terminal interaction network that involves 1 400 742 terminals (e.g., personal computers, smart mobile devices and servers) and 77 441 650 directed interactions scattered over 1158 time slots. Its dimensions come to 1 400 742 × 1 400 742 × 1158, while its data density (known data entry count over total) is only about 3.41 × 10⁻⁸. Despite its HDI nature, it contains rich knowledge describing various desired patterns, like potential links and abnormal terminals [1], [2]. Therefore, determining how to acquire such desired knowledge from it becomes a thorny issue.
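
    The quoted density can be verified with a few lines of arithmetic; the sketch below (not from the paper) simply recomputes it from the figures above.

```python
# Sanity check of the reported TIPAS data density:
# density = known directed interactions / total possible tensor entries.
terminals = 1_400_742
time_slots = 1_158
known_links = 77_441_650

total_entries = terminals * terminals * time_slots
density = known_links / total_entries
print(f"{density:.2e}")  # ~3.41e-08, matching the value quoted above
```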

    To date, researchers have proposed a myriad of data analysis models for extracting useful knowledge from an unweighted network. For example, a static network can be analyzed through a node similarity-based model [12], a Metropolis-Hastings random walk model [13], a kernel extreme learning machine-based model [14], and a random walk-based DeepWalk model [15]. When a target network is dynamic, its knowledge acquisition can be implemented through a kernel similarity-based model [16], a Louvain-based dynamic community detection model [17], a spectral graph wavelet model [18], and a dynamic node embedding model [19]. The above-mentioned models are able to perform knowledge discovery on an unweighted network. However, they are incompatible with a DWDN, whose links are directed and weighted.

    On the other hand, an HDI tensor is able to fully preserve a DWDN's structural, spatial and temporal patterns [20], [21], giving it inherent advantages in describing a DWDN. Therefore, a latent factorization-of-tensors (LFT)-based approach to DWDN representation learning has become popular in recent years [22]–[26]. Representative models of this kind include a regularized tensor decomposition model [22], a Candecomp/Parafac (CP) weighted optimization model [23], a biased non-negative LFT model [24], a fused CP factorization model [25], and a non-negative CP factorization model [26].

    When building an LFT-based model, stochastic gradient descent (SGD) is mostly adopted as the learning algorithm to build the desired latent features (LFs) [27]–[29]. However, an SGD-based LFT model updates an LF depending on the stochastic gradient defined on the instant loss only, without considering historical update information. Consequently, it takes many iterations to converge, resulting in low computational efficiency on large-scale DWDNs from industrial applications [30]–[32]. For instance, considering the aforementioned DWDN generated by a real TIPAS, an SGD-based LFT model needs to traverse 77 441 650 entries in one iteration, which is computationally expensive.

    A proportional-integral-derivative (PID) controller utilizes the past, current and future information of the prediction error to control an automated system [33], [34]. It has been widely adopted in various control-related applications [35], [36], e.g., a digital fractional-order PID controller for a magnetic levitation system [37], a PID passivity-based controller for the regulation of port-Hamiltonian systems [38], a cascaded PID controller for automatic generation control analysis [39], and a nonlinear self-tuning PID controller for position control of a flexible manipulator [40]. However, a PID-based method is rarely adopted in HDI tensor representation, which is frequently encountered in many big-data-related applications [31], [32].

    Motivated by the above discovery, the following research question is proposed:

    Research Question: Is it possible to incorporate the PID principle into an SGD-based LFT model, thereby improving its performance on an HDI tensor?

    To answer it, this work innovatively proposes a PID-incorporated LFT (PLFT) model, which seamlessly integrates the PID control principle into an SGD solver. A PLFT model's main idea is to construct an adjusted instance error by using a PID controller, and then adopt this adjusted error to perform an SGD-based learning process, thereby improving the convergence rate. This work aims to make the following contributions:

    1) A PLFT model. It performs latent feature analysis on an HDI tensor with high efficiency and precision;

    2) Detailed algorithm design and analysis for PLFT. It provides specific guidance for researchers to implement PLFT for analyzing DWDNs.

    Empirical studies on two large-scale DWDNs from a real TIPAS demonstrate that, compared with state-of-the-art models, PLFT achieves impressively high efficiency as well as highly competitive prediction accuracy for link prediction.

    The remainder of this paper is organized as follows. Section II provides the preliminaries. Section III gives a detailed description of PLFT. Section IV reports experimental results. Section V discusses related work. Finally, Section VI concludes this paper.

    II. PRELIMINARIES

    A. Adopted Symbols

    The symbols adopted in this paper are summarized in Table I.

    TABLE I: ADOPTED SYMBOLS AND THEIR DESCRIPTIONS

    B. A DWDN and an HDI Tensor

    Fig. 1 depicts a small example illustrating the connections between a DWDN and an HDI tensor. Consider the dynamic states of a DWDN during |K| time slots. The network can be described by a snapshot sequence in which each snapshot describes its state observed during a specified time slot.

    Definition 1 (DWDN State): G_k = (V, L_k) is the k-th state of a DWDN, where L_k is the weighted directed link set observed at time slot k ∈ K = {1, 2, ..., |K|}.

    An HDI tensor can accurately describe a DWDN [24], [25]. Formally, it is defined as follows.

    Definition 2 (HDI Tensor): Given entity sets I, J, and K, let Y ∈ R^{|I|×|J|×|K|} be a tensor where each entry y_ijk represents a certain relationship among entities i ∈ I, j ∈ J and k ∈ K. Given Y's known and unknown entry sets Λ and Γ, Y is an HDI tensor if |Λ| ≪ |Γ|.
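
    In code, an HDI tensor of this scale is naturally stored by its known entry set Λ alone; the sketch below (illustrative names, not the paper's implementation) keeps (i, j, k) → y_ijk triples and treats every other index as unknown, i.e., in Γ.

```python
# An HDI tensor stored sparsely: only the known entry set Λ is kept as
# (i, j, k) -> y_ijk triples; a dense |I|x|J|x|K| array would be infeasible.
known_entries = {
    (0, 3, 0): 2.5,  # terminal 0 -> terminal 3, weight 2.5, time slot 0
    (3, 1, 0): 1.0,
    (0, 3, 1): 4.0,  # same directed pair observed in a later time slot
}

def lookup(i, j, k, entries=known_entries):
    """Return the observed weight y_ijk, or None if (i, j, k) lies in Γ."""
    return entries.get((i, j, k))
```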

    Fig. 1. A DWDN and an HDI tensor Y.

    C. An LFT Model

    To obtain three LF matrices S, D and T, we build an objective function ε to measure the difference between Y and its rank-R approximation Ŷ, whose entries are ŷ_ijk = Σ_{r=1}^{R} s_ir d_jr t_kr. It relies on the commonly-used Euclidean distance [21]–[25]:

    ε = (1/2) Σ_{i∈I} Σ_{j∈J} Σ_{k∈K} (y_ijk − ŷ_ijk)².  (3)

    As shown in Fig. 2, Y is an HDI tensor with only a few known entries. Hence, we define ε on Λ only to represent Y's known entries precisely [20], [24], [28], [29], which reformulates (3) as

    ε = (1/2) Σ_{y_ijk∈Λ} (y_ijk − ŷ_ijk)².  (4)

    Fig. 2. Latent factorization of an HDI tensor Y.

    According to prior research [24]–[26], due to the imbalanced distribution of Y's known entries, it is essential to incorporate Tikhonov regularization into (4) to avoid overfitting:

    ε = Σ_{y_ijk∈Λ} ( (1/2)(y_ijk − ŷ_ijk)² + (λ/2)(‖s_i‖² + ‖d_j‖² + ‖t_k‖²) ),  (5)

    where s_i, d_j and t_k denote the i-th, j-th and k-th rows of S, D and T, respectively.
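
    A minimal sketch of evaluating a regularized objective of this kind, assuming the rank-R CP form ŷ_ijk = Σ_r s_ir d_jr t_kr; the function and variable names are illustrative, not the paper's code.

```python
# Regularized loss accumulated over the known entry set Λ only: squared
# instance error plus Tikhonov (L2) terms on the LF rows involved.
def objective(entries, S, D, T, lam):
    eps = 0.0
    for (i, j, k), y in entries.items():
        y_hat = sum(a * b * c for a, b, c in zip(S[i], D[j], T[k]))
        reg = sum(v * v for row in (S[i], D[j], T[k]) for v in row)
        eps += 0.5 * (y - y_hat) ** 2 + 0.5 * lam * reg
    return eps
```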

    With SGD, the learning scheme for the desired LFs is given as

    s_i ← s_i + η (e_ijk (d_j ∘ t_k) − λ s_i),
    d_j ← d_j + η (e_ijk (s_i ∘ t_k) − λ d_j),
    t_k ← t_k + η (e_ijk (s_i ∘ d_j) − λ t_k),  (6)

    where ∘ denotes the element-wise product and e_ijk = y_ijk − ŷ_ijk denotes the instant error on y_ijk. With (6), an SGD-based LFT model can extract latent patterns from an HDI tensor. For instance, Fig. 3 depicts the process of performing missing link prediction for G through an LFT model.
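
    The SGD learning scheme can be sketched as follows; this is a hedged illustration using plain Python lists and the rank-R CP form of ŷ_ijk, not the authors' implementation.

```python
# One SGD step on a single known entry y_ijk: the LF rows S[i], D[j], T[k]
# move along the stochastic gradient of the regularized instant loss,
# driven by the instant error e_ijk = y_ijk - ŷ_ijk.
def sgd_step(i, j, k, y, S, D, T, eta, lam):
    s, d, t = S[i], D[j], T[k]
    e = y - sum(a * b * c for a, b, c in zip(s, d, t))  # instant error
    for r in range(len(s)):
        sr, dr, tr = s[r], d[r], t[r]                   # pre-update values
        s[r] += eta * (e * dr * tr - lam * sr)
        d[r] += eta * (e * sr * tr - lam * dr)
        t[r] += eta * (e * sr * dr - lam * tr)
    return e
```

    Sweeping all entries of Λ once with such a step constitutes one training iteration.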

    D. A PID Controller

    A PID controller adjusts the observed error based on proportional, integral, and derivative terms [33]–[36]. Fig. 4 depicts the flowchart of a discrete PID controller. Note that this paper adopts this basic controller because it is directly compatible with an SGD-based LFT model's learning scheme, as shown later. As shown in Fig. 4, at the m-th time point, a discrete PID controller builds the adjusted error Ẽ_m as follows:

    Ẽ_m = C_P E_m + C_I Σ_{i=1}^{m} E_i + C_D (E_m − E_{m−1}),  (7)

    where E_m and E_i are the observed errors at the m-th and i-th time points, and C_P, C_I and C_D are the coefficients controlling the proportional, integral and derivative effects, respectively. With the adjusted error, the system takes prior errors into account to implement precise control. Next, we integrate the PID control principle into SGD-based LFT to achieve a PLFT model.
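
    The adjusted error of (7) can be sketched as a small stateful routine; a minimal illustration, with the coefficient values below chosen arbitrarily.

```python
# Discrete PID error adjustment: the adjusted error combines the current
# error (P), the running sum of past errors (I), and the latest error
# change (D), weighted by C_P, C_I and C_D.
def make_pid(cp, ci, cd):
    state = {"total": 0.0, "prev": 0.0}
    def adjust(e_m):
        state["total"] += e_m
        e_adj = cp * e_m + ci * state["total"] + cd * (e_m - state["prev"])
        state["prev"] = e_m
        return e_adj
    return adjust

pid = make_pid(cp=1.0, ci=0.5, cd=2.0)
assert pid(1.0) == 3.5   # 1.0 + 0.5*1.0 + 2.0*(1.0 - 0.0)
assert pid(0.5) == 0.25  # 0.5 + 0.5*1.5 + 2.0*(0.5 - 1.0)
```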

    III. PLFT MODEL

    A. Model

    According to [41]–[43], an LFT model's stability and representation learning ability can be enhanced by incorporating linear biases (LBs) into its learning objective. According to [24], for a three-way HDI tensor Y ∈ R^{|I|×|J|×|K|}, the LBs can be modeled through three LB vectors a, b, and c with lengths |I|, |J|, and |K|, respectively. By introducing them into (5), we arrive at the biased learning objective given as (8), which takes the same form as (5) with the approximation extended to ŷ_ijk = Σ_{r=1}^{R} s_ir d_jr t_kr + a_i + b_j + c_k.

    With (8), Ŷ consists of R rank-one tensors {X_r | r ∈ {1, ..., R}} built on the LF matrices S, D and T, and three rank-one bias tensors H_1–H_3 built on the LB matrices A, B and C. The LB vectors a, b and c are respectively set as the first, second and third columns of A, B and C, and the other columns are set as one-valued column vectors, which is similar to the LBs involved in a matrix factorization model [42], [43]. This design is shown in Fig. 5.

    Fig. 3. Missing link prediction by performing LFT on an HDI tensor Y.

    Fig. 4. Flow chart of a discrete PID controller.

    As given in (5), with SGD, the instant error of an instance y_ijk at the n-th training iteration is e_ijk = y_ijk − ŷ_ijk. Following (7), this instant error can be regarded as a PID controller's proportional term. Moreover, the information of y_ijk is passed into a parameter update through e_ijk only in a training iteration. Therefore, this process can be described by a simplified PID controller with C_P = 1 and C_I = C_D = 0.

    Fig. 5. Building HDI tensor Y's biased approximation tensor Ŷ.

    With (10)–(15), a PLFT model is achieved. Its flow is depicted in Fig. 6.
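
    The core idea can be sketched by replacing the raw instant error in the SGD step with a per-instance PID-adjusted error; this is a hedged illustration of the principle with illustrative names, not the paper's Algorithm PLFT. The per-instance (total, prev) state corresponds to the auxiliary integral and derivative arrays counted in the storage analysis.

```python
# One PLFT-style step on a known entry y_ijk: the SGD update form is kept,
# but the raw instant error is replaced by a PID-adjusted error built from
# this instance's past errors (integral) and latest change (derivative).
def plft_step(i, j, k, y, S, D, T, state, eta, lam, cp, ci, cd):
    s, d, t = S[i], D[j], T[k]
    e = y - sum(a * b * c for a, b, c in zip(s, d, t))  # raw instant error
    total, prev = state.get((i, j, k), (0.0, 0.0))
    total += e
    e_adj = cp * e + ci * total + cd * (e - prev)       # PID-adjusted error
    state[(i, j, k)] = (total, e)
    for r in range(len(s)):
        sr, dr, tr = s[r], d[r], t[r]                   # pre-update values
        s[r] += eta * (e_adj * dr * tr - lam * sr)
        d[r] += eta * (e_adj * sr * tr - lam * dr)
        t[r] += eta * (e_adj * sr * dr - lam * tr)
    return e
```

    With C_P = 1 and C_I = C_D = 0 this reduces exactly to the plain SGD step, matching the simplified-controller view above.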

    B. Algorithm Design and Analysis

    Based on the above inferences, we design Algorithm PLFT. According to it, the main task in each iteration is to update the LF matrices and LB vectors. Hence, its computation cost is

    C ≈ Θ(n × |Λ| × R),  (16)

    where n denotes the iteration count. Note that (16) adopts the condition that |Λ| ≫ max{|I|, |J|, |K|}, which is constantly fulfilled in real applications. Since n and R are positive constants in practice, a PLFT model's computation cost is linear in an HDI tensor's known entry count.

    Fig. 6. Flowchart of PLFT.

    Its storage cost mainly depends on the following three factors: 1) arrays caching the input known entries Λ and the corresponding estimations; 2) auxiliary arrays caching the integral and derivative terms on each instance; and 3) the LF matrices S, D and T, and the linear bias vectors a, b and c. Hence, PLFT's storage cost is on the order of

    Θ(|Λ| + (|I| + |J| + |K|) × R),  (17)

    which is linear with the target HDI tensor's known entry count and its LF count. The next section conducts experiments to validate a PLFT model's performance.

    IV. EXPERIMENTAL RESULTS AND ANALYSES

    A. General Settings

    Datasets: The experiments are conducted on two large-scale DWDNs, labeled as D1 and D2, from a real TIPAS that concerns the dynamic interactions over two million terminals. Table II summarizes their details. On them, we build two HDI tensors; e.g., the HDI tensor corresponding to D1 has the size of 1 400 742 × 1 400 742 × 1158 and a data density of only 3.41 × 10⁻⁸.

    TABLE II: DATASET DETAILS

    To achieve objective results, each dataset is randomly split into the disjoint training set K, validation set Ψ, and testing set Ω at the ratio of 7:1:2. A tested model is built on K, validated on Ψ, and tested on Ω to obtain the final outcomes. The training process of a model terminates if 1) the number of consumed iterations reaches a preset threshold, e.g., 500, or 2) it converges, i.e., the error difference between two consecutive iterations is less than 10⁻⁶. Note that for all involved models, the LF space dimension R is uniformly fixed at 20 to balance their computational cost and representation learning ability, following the settings in [20]–[25].

    Evaluation Metrics: We choose weighted missing link prediction as the evaluation protocol due to the great need of building a full interaction mapping among the involved entities in real systems [20]–[25]. The root mean squared error (RMSE) and mean absolute error (MAE) are adopted as the evaluation metrics:

    RMSE = ( Σ_{y_ijk∈Ω} (y_ijk − ŷ_ijk)² / |Ω| )^{1/2},  MAE = Σ_{y_ijk∈Ω} |y_ijk − ŷ_ijk| / |Ω|.

    Note that for a tested model, smaller RMSE and MAE denote higher prediction accuracy for missing links in a DWDN.
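
    The two metrics can be sketched as follows, computed over held-out (observed weight, prediction) pairs; an illustration, not the evaluation code used in the paper.

```python
import math

# RMSE and MAE over the testing set Ω, given (y_ijk, ŷ_ijk) pairs.
def rmse_mae(pairs):
    n = len(pairs)
    sq = sum((y - y_hat) ** 2 for y, y_hat in pairs)
    ab = sum(abs(y - y_hat) for y, y_hat in pairs)
    return math.sqrt(sq / n), ab / n

r, m = rmse_mae([(1.0, 0.5), (2.0, 2.5), (3.0, 3.0)])
assert abs(r - math.sqrt(0.5 / 3)) < 1e-12
assert abs(m - 1.0 / 3.0) < 1e-12  # MAE = (0.5 + 0.5 + 0.0) / 3
```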

    B. Parameter Sensitivity Tests

    As shown in Section III, PLFT's performance is affected by the learning rate η, the regularization coefficient λ, and the PID constants C_P, C_I, and C_D. Thus, we start with parameter sensitivity tests.

    1) Effects of η and λ

    Fig. 7. Impacts of η and λ.

    This set of experiments fixes the PID coefficients and increases η from 0.001 to 0.006 and λ from 0.05 to 0.25. Figs. 7(a) and 7(c) record PLFT's RMSE and MAE as η and λ change. Figs. 7(b) and 7(d) depict the corresponding iteration counts. From these results, we have the following findings:

    a) PLFT’s convergence is related with η and λ:As shown in Figs.7(b) and 7(d),the converging iteration counts of PLFT in RMSE and MAE decrease asηincreases while they increase asλincreases onD1 andD2.For instance,as shown in Fig.7(b),asηincreases from 0.001 to 0.006 onD2,the converging iteration count of PLFT decreases from 21 to 5 in RMSE,and from 20 to 5 in MAE,respectively.This results is consistent with the role ofηin PLFT,i.e.,the larger the learning rate,the faster the parameter update velocity,resulting in fewer iterations for convergence.However,we also note that owing to its incorporation of PID,PLFT’s convergence rate stays consistently high even if the learning rate is set to be small.On the other hand,as shown in Fig.7(d),PLFT’s converging iteration count is generally linear withλ.For instance,asλincreases from 0.05 to 0.25 onD2,its converging iteration count increases from 21 to 24 in RMSE,and from 20 to 26 in MAE.This is because whenλgrows larger,the parameter update process requires more fitting regularization term,which leads to more iterations.

    b) PLFT’s prediction error increases as η and λ increase:For instance,as depicted in Fig.7(a),PLFT’s RMSE is 0.2954 and 0.3004 asη= 0.001 and 0.006 onD1,where the increment ((RMSEhigh–RMSElow)/RMSEhigh) is 1.66%.This result indicates that sinceηgrows larger,the parameter update process overshoots an optimum.As shown in Fig.7(c),PLFT’s RMSE is 0.2954 and 0.2961 asλincreases from 0.05 to 0.25 onD1,where the increment is 0.24%.This phenomenon demonstrates the occurrence of underfitting asλincreases continually.Note that similar situations are seen onD2,as shown in Figs.7(a) and 7(c).In general,the PLFT’s prediction error increases more rapidly inηthan inλ.

    2) Effects of C_P, C_I, and C_D

    This set of experiments fixes η and λ, and increases the PID coefficients to test their effects on PLFT's performance. The experimental results are illustrated in Fig. 8. From them, we have the following findings:

    a) PLFT's converging iteration count relies on C_P, C_I, and C_D: As shown in Figs. 8(b) and 8(d), PLFT's converging iteration count in RMSE and MAE decreases as C_P and C_I increase. However, it increases as C_D increases, as shown in Fig. 8(f). For instance, as shown in Fig. 8(b), on D1, PLFT's converging iteration count in RMSE is 57 with C_P = 1. When C_P increases to 5, it decreases to 23. Likewise, as C_I increases from 0.01 to 0.09, it decreases from 34 to 15. In contrast, PLFT's converging iteration count in RMSE increases from 24 to 43 as C_D increases from 10 to 90. Similar phenomena are encountered on D2, as shown in Figs. 8(b), 8(d) and 8(f).

    b) C_P and C_I affect PLFT's prediction error slightly: As shown in Figs. 8(a) and 8(c), on D1 and D2, PLFT's RMSE and MAE increase slightly with C_P and C_I, while they are generally steady as C_D increases. For instance, on D1, as shown in Fig. 8(a), PLFT's MAE increases from 0.2180 to 0.2196 as C_P increases from 1 to 5, where the increment is 0.73%. According to Fig. 8(c), PLFT's MAE is 0.2184 with C_I = 0.01 and 0.2193 with C_I = 0.09, where the increment is 0.42%. In comparison, as shown in Fig. 8(e), when C_D increases from 10 to 90, the difference in PLFT's MAE is only 0.0002. Similar results are obtained on D2, as shown in Figs. 8(a), 8(c) and 8(e).

    Fig. 8. Impacts of C_P, C_I and C_D.

    Note that the above results are consistent with the theoretical analysis in Section III-A. C_P controls the proportional term, and actually plays the role of the learning rate in the current update. C_I controls the adjustment of the learning direction. Accordingly, a larger C_I imposes a stronger restriction on the oscillation of a parameter update process, which can make PLFT converge faster but also cause slight accuracy loss. C_D controls the future expectations of a parameter update process. A C_D in an appropriate range can effectively avoid overshooting. Moreover, a relatively small C_D can make PLFT converge faster to an optimum.

    3) Summary

    Based on the above results, we summarize that PLFT's convergence rate is sensitive to its hyper-parameters, while its prediction accuracy is less sensitive to them. However, to further increase a PLFT model's practicability in addressing DWDN analysis problems, a hyper-parameter self-adaptive scheme is desired. We plan to address this as a future issue.

    C. Comparison With State-of-the-art Models

    Three state-of-the-art models capable of analyzing an HDI tensor are compared with the proposed model. Their details are as follows:

    a) M1: A timeSVD++ model [44]. It incorporates temporal effects into the objective function of a latent factor model such that the model can well represent an HDI tensor.

    TABLE III: HYPER-PARAMETER SETTINGS OF M1–M4

    Fig. 9. Training curves of M1–M4 on D1 and D2.

    b) M2: A PARAFAC decomposition-based model [45]. It builds an l1-norm-based regularization scheme, and adopts non-negative alternating least squares as its learning scheme.

    c) M3: An SGD-based LFT model [29]. It adopts standard SGD to perform parameter learning.

    d) M4: The PLFT model proposed in this paper.

    Note that the hyper-parameter settings of M1–M4 are given in Table III. For M1–M4, we perform a grid search for their related hyper-parameters on one set of experiments out of 20 independent runs in total to achieve their best performance, and then adopt the same settings on the remaining 19 sets of experiments.

    Fig. 9 depicts M1–M4's training curves. Fig. 10 records their total time costs. Fig. 11 depicts their RMSE and MAE. From these results, we have the following important findings:

    a) M4 converges much faster than its peers do: As shown in Fig. 9(a), M4 takes only 23 iterations to converge in RMSE on D1. In comparison, M1–M3 take 64, 26 and 67 iterations to converge in RMSE on D1, respectively. Considering MAE, as shown in Fig. 9(b), M1–M3 respectively require 68, 26 and 60 iterations to converge on D1, while M4 consumes only 21 iterations. Similar phenomena can be found on D2, as depicted in Figs. 9(c) and 9(d). From these results, we can clearly see that by incorporating the PID control principle, PLFT's convergence rate is improved significantly over an SGD-based LFT model. It turns out that incorporating the PID control principle into an SGD-based LFT model is feasible and highly effective.

    b) M4 achieves higher computational efficiency than its peers do: For instance, as shown in Fig. 10(a), on D1, M4 takes about 10 min to converge in RMSE, which is 8.13% of the 123 min by M1, 7.63% of the 131 min by M2, and 40% of the 25 min by M3. Considering MAE, M4 takes about 9 min to converge, which is 6.87% of the 131 min by M1, 6.62% of the 136 min by M2, and 40.90% of the 22 min by M3. Similar outputs are obtained on D2, as in Fig. 10(b). As summarized in Algorithm PLFT, the computation cost of a PLFT model mainly relies on an HDI tensor's known entry set. From the above experimental results, we evidently see that such a design is extremely suitable and necessary when analyzing a large-scale HDI DWDN.

    Fig. 10. Total time cost (min) of M1–M4 on D1 and D2.

    Fig. 11. The RMSE and MAE of M1–M4 on D1 and D2.

    c) M4 has highly competitive prediction accuracy for the missing data of an HDI tensor when compared with its peers: As depicted in Fig. 11(a), on D1, M4's RMSE is 0.2954, which is about 14.10% lower than the 0.3439 RMSE obtained by M1 and 4.15% lower than the 0.3082 RMSE obtained by M2, but 0.14% higher than the 0.2950 RMSE obtained by M3. Likewise, on D2, M1–M4 respectively achieve RMSEs of 0.3645, 0.3223, 0.3096 and 0.3103, where the RMSE of M4 is 14.87% and 3.72% lower than that of M1 and M2, while 0.23% higher than that of M3. When measured by MAE, similar results are achieved on both datasets, as shown in Fig. 11(b). Note that M4 suffers from a small accuracy loss when compared with M3, yet its computational efficiency is much higher. This phenomenon indicates that the performance of a PLFT model can still be further improved by incorporating another advanced controller, e.g., a nonlinear PID controller. We plan to address this issue in future work.

    D. Summary

    According to the experimental results, we conclude that PLFT is equipped with: a) a high convergence rate, b) impressively high computational efficiency, and c) highly competitive prediction accuracy. Therefore, PLFT is a highly effective model for large-scale DWDN analysis.

    V. RELATED WORK

    Networks have become increasingly important for modeling complex systems consisting of numerous entities and interactions. Network data analysis is widely involved in many fields, where missing link prediction is one of the essential and vital tasks [1]–[5]. To date, many sophisticated models for missing link prediction in a given network have been proposed. They can be roughly divided into the following three categories:

    1) LF Analysis-based Models

    Duan et al. [46] propose an ensemble approach, which decomposes traditional link prediction into sub-problems of smaller size, each solved with LF analysis models. Nguyen et al. [47] utilize LF kernels and cast the link prediction problem into binary classification to implement link prediction on sparse graphs. Zhu et al. [48] propose a temporal latent space model for dynamic link prediction, where the dynamic link over time is predicted from a sequence of previous graph snapshots. Ye et al. [49] use non-negative matrix tri-factorization to construct a network's latent topological features, which acquires the principal common factors of the network's link structures. Wang et al. [50] present a fusion model that fuses an adjacency matrix and key topological metrics into a unified probabilistic LF analysis-based model for link prediction, which considers both symmetric and asymmetric metrics.

    2) Neural Network-based Models

    Wang et al. [51] design a Bayesian-based relational network model that combines network structures and node attributes to predict missing links. Chen et al. [52] propose a dynamic link prediction model by combining long short-term memory and encoder-decoder mechanisms. Li et al. [53] develop a temporal-restricted Boltzmann machine-based deep network, which predicts missing links by utilizing individual transition variances and the influence of local neighbors. Ozcan et al. [54] propose a multivariate link prediction model for evolving heterogeneous networks via a nonlinear autoregressive neural network. Li et al. [55] model the evolution pattern of each link in a network by incorporating a gradient boosting decision tree and a temporal restricted Boltzmann machine.

    3) Network Analysis-based Models

    Grover et al. [56] map the involved nodes into a low-dimensional feature space and adopt a biased random walk method to implement link prediction. Fu et al. [57] utilize line graph indices along with original graph indices to implement link weight prediction. Wang et al. [58] develop a cold-start link prediction approach by establishing the connections between topological and non-topological information. Xia et al. [59] design a model based on a random walk with restart, which can determine potential collaborators in academic social networks. De et al. [60] design a novel framework using a stacked two-level learning paradigm for link prediction, which integrates local, community, and global signals.

    The above-mentioned approaches can efficiently perform missing link prediction on a specified network. However, they all fail to address a DWDN with directed and weighted missing links. Moreover, they commonly suffer from high computational cost due to their model structures or learning schemes. In comparison, the proposed PLFT model describes a DWDN via an HDI tensor seamlessly, and achieves high efficiency by incorporating the PID principle into its learning scheme.

    VI. CONCLUSIONS

    To improve an SGD-based LFT model's efficiency, this work proposes a PLFT model that combines the information of past, current and future updates into its parameter update following the PID control principle, which, to the best of our knowledge, has never been seen in existing studies. It exhibits a higher convergence rate and greater computational efficiency than state-of-the-art methods, and very competitive accuracy in missing link prediction in comparison with them. Hence, it provides a highly effective solution for large-scale DWDN analysis. However, the following issues remain open:

    1) The performance of a PLFT model is affected by its hyper-parameters. Thus, making them adaptable to various applications is of great importance [61], [62].

    2) A nonlinear PID controller [40], [63] has shown excellent performance in real applications. Thus, we may pursue further performance gains by incorporating its principle into an SGD-based LFT model.

    3) It is worth determining how to further improve PLFT's computational efficiency via a parallel computation framework [64], [65], and how to apply PLFT to other datasets [66]–[69]. We plan to address these issues in our future work.
