
    The adaptive distributed learning based on homomorphic encryption and blockchain①

2022-02-11
High Technology Letters, 2022, Issue 4

YANG Ruizhe(楊睿哲), ZHAO Xuehui, ZHANG Yanhua②, SI Pengbo, TENG Yinglei

(*Faculty of Information Technology, Beijing University of Technology, Beijing 100124, P.R.China)

(**School of Electronic Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, P.R.China)

    Abstract

Key words: blockchain, distributed machine learning (DML), privacy, security

    0 Introduction

Today, people and the Internet of Things are generating unprecedented amounts of data. Based on these data, machine learning, as a data analytics tool, learns from data, recognizes patterns, and makes decisions, becoming an important part of artificial intelligence (AI). However, collecting abundant data at a central server is usually expensive and time consuming, or even impossible due to privacy and security issues. Therefore, distributed machine learning (DML) has attracted more and more attention as an alternative approach[1]. DML allows each owner to retain its original data and to compute and update the model parameters locally using that data; the local parameters are periodically transmitted to the central server, which updates the global model by aggregating them. In this way, the privacy of the original data is protected and the huge cost of data collection is avoided. However, cyber risks remain in the interaction and message transmission, and malicious attacks can still extract some of the underlying data information.

In order to resist attacks and enhance privacy, differential privacy and cryptographic technologies have been introduced in existing work. Differential privacy (DP) protects sensitive information by adding random noise, so that attackers cannot make inferences from the results of randomized algorithms[2-3]. However, this differential noise needs to be smaller than the random noise of the samples, otherwise it adversely affects the accuracy and privacy of the learning model[4]. On the other hand, cryptography, with its mechanism of encryption, can also protect the confidentiality of sensitive data. Here, secure multiparty computation (SMC) and homomorphic encryption (HE) are the two main technologies. Comparatively, HE, which processes the underlying plaintext data in encrypted form without decryption, is less complicated. In current studies[5-7], solutions using partially homomorphic encryption to protect privacy in DML are more effective than those based on SMC. Considering efficiency, Ref.[8] proposed an HE-based batch encryption technique to address the encryption and communication bottlenecks.
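As a concrete illustration of why Paillier suits this setting, the toy sketch below (insecure key sizes, for exposition only) verifies its additive homomorphism: multiplying ciphertexts adds the plaintexts, and exponentiating a ciphertext scales its plaintext:

```python
import random
from math import gcd

def keygen(p=10007, q=10009):
    """Toy Paillier key pair -- real deployments use >=2048-bit primes."""
    n = p * q
    g = n + 1                                      # standard generator choice
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    # with g = n+1, L(g^lam mod n^2) = lam mod n, so mu = lam^{-1} mod n
    mu = pow(lam % n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = random.randrange(2, n)                     # fresh randomness per ciphertext
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

pk, sk = keygen()
n2 = pk[0] ** 2
c1, c2 = encrypt(pk, 17), encrypt(pk, 25)
assert decrypt(pk, sk, (c1 * c2) % n2) == 42       # E(m1)*E(m2) -> m1+m2
assert decrypt(pk, sk, pow(c1, 3, n2)) == 51       # E(m)^a     -> a*m
```

Because addition and scalar multiplication are exactly the operations needed for gradient updates and weighted averaging, a server can aggregate model parameters without ever seeing them in the clear.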

In the DML system, another challenge is security. Ref.[9] analyzed the potential threats, including semi-honest and malicious internal and external attacks. Some researchers have proposed distributed learning systems based on blockchain[10-11]. Blockchain is a representative paradigm of distributed control technology. It provides a peer-to-peer network that enables sensitive participants to communicate and collaborate in a secure manner without the need for a fully trusted third party or an authorized central node. Ref.[10] proposed a blockchain based decentralized learning system for medical data and utilized blockchain as an information exchange platform. Refs[12-14] presented blockchain based distributed learning processes to resist different threats.

Attention to both privacy and security in DML has become a key issue; however, the computation and energy consumption brought by sophisticated privacy protection technology have been ignored in most existing work. Here, a valuable business model is considered, in which the computing party, as the beneficiary, hires participants to jointly learn a desired global model. Specifically, the computing party aggregates the local models learned from the participants' data, but does not share the aggregated results with any participant, so that it owns the value of the model. Therefore, an adaptive distributed machine learning scheme based on blockchain and homomorphic encryption is proposed. First, the Paillier encryption algorithm is used in the linear regression model, where the computing party issues the global ciphertext parameters and the participants always make ciphertext updates, so that the privacy of both the data and the model is preserved. Second, the blockchain is introduced to complete the privacy-protected distributed linear regression process, which ensures the integrity and correctness of the updates and of the final model. Finally and most importantly, the total energy consumption of the system is analyzed from the perspective of computation complexity and computation resources, based on which a joint optimization of the computation resources allocation and the adaptive aggregation under limited system energy is given. Analysis and simulation results show the efficiency of the scheme in terms of security, privacy and learning convergence.

    1 System model

In this section, a distributed learning framework is proposed based on homomorphic encryption and blockchain, which can complete the trusted learning process of the model without leaking private information.

    1.1 System overview

The system completes secure and private distributed learning based on the blockchain network, as shown in Fig.1. The system model includes two roles: the participants owning their private data and the computing party having data analytics requirements (commercial requirements, etc.). The computing party initiates the collaboration with the participants to derive a global model from their data. Since the local model parameters of each participant are sensitive information with commercial value to others, the partially homomorphic encryption algorithm Paillier is used to encrypt them, providing privacy protection. Besides, these ciphertext model parameters are uploaded to the blockchain, and the underlying distributed consensus guarantees the sharing of model parameters in a reliable manner, thereby improving the transparency and traceability of the learning.

    Fig.1 System model

Specifically, it is assumed that the data required by the computing party is distributed among the participants. The system nodes, composed of the participants P and the computing party C, are denoted as I = {P, C}, |I| = N + 1, where |·| represents the size of a set. In the set P = {P1, P2, …, PN}, the element Pi (i ∈ {1, 2, …, N}) represents the participant holding a sub-dataset Di, and the total dataset is D = {D1, D2, …, DN}.

    1.2 Distributed gradient descent

For simplicity, the distributed learning of a linear regression model is considered. For each data sample (xij, yij), the loss function can be expressed as

where xij and yij respectively denote the j-th input vector and output of dataset Di, and wi is the local parameter. Then the local loss function of Pi based on dataset Di is

where w is the global parameter.

During the learning process, t (t = 1, 2, …, T) is set as the iteration index, and T is the total number of iterations. Each iteration contains a local update and a possible global aggregation. When t = 0, the local parameters of all participants are initialized to the same value. When t > 0, the local model wi(t) of Pi is computed according to the gradient descent (GD) rule based on its previous iteration. After a certain number of local updates, the computing party performs a global aggregation to derive the global model, which is a weighted average of the local models of all participants.

Therefore, the local update at Pi using the gradient descent algorithm with the local loss function can be described as

where η > 0 is the learning rate. If a global aggregation is performed by the computing party C at iteration t, the global model is defined as Eq.(5).
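The local gradient-descent update and the weighted global aggregation described above can be sketched in plaintext as follows (synthetic data; the function names and the mean-squared-loss normalization are illustrative choices, not fixed by the paper):

```python
import numpy as np

def local_update(w, X, y, eta, tau):
    """Run tau gradient-descent steps on one participant's data shard."""
    for _ in range(tau):
        grad = X.T @ (X @ w - y) / len(y)      # gradient of mean squared loss
        w = w - eta * grad
    return w

def global_aggregate(local_ws, shard_sizes):
    """Weighted average of local models with weights |Di|/|D|."""
    return np.average(np.stack(local_ws), axis=0, weights=shard_sizes)

# two participants jointly fit y = 2*x0 - x1 without pooling their data
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
shards = []
for _ in range(2):
    X = rng.normal(size=(40, 2))
    shards.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(50):                            # 50 global aggregations, tau = 5
    local = [local_update(w, X, y, eta=0.1, tau=5) for X, y in shards]
    w = global_aggregate(local, [len(y) for _, y in shards])
```

After the loop, `w` is close to `w_true`, even though each participant only ever touched its own shard.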

    1.3 Learning based on blockchain and Paillier

It is defined that there are τ local updates between every two global aggregations. It is assumed that T is a multiple of τ, and G is the total number of global aggregations, so T = Gτ.

In a cryptographic system, the public key Pk is used for encryption and the private key Sk is used for decryption. Here, [[·]] is used to represent a ciphertext message encrypted by Paillier. During the encrypted learning, the computing party owns both the public key and the private key to obtain the global model, while the participants only hold the public key, so that the model parameters are shared among the participants only in the form of ciphertext, ensuring the privacy of the system model.

    The encrypted learning process can be divided into model initialization,local update,and global aggregation.

(1) Model initialization. The computing party and the participants, as the blockchain nodes, form a peer-to-peer decentralized network. Before learning, the key pair (Pk, Sk) is generated, then all nodes agree on Pk and the hyperparameters of the learning model, such as the learning rate and the initial local ciphertext model.

(2) Local update. In the ciphertext state, the participants use the gradient descent algorithm to update the local model in the current iteration based on the local ciphertext model completed in the previous iteration. During the encrypted updating, the participants compute the local gradient using the homomorphic property [[a·m1 + b·m2]] = [[m1]]^a · [[m2]]^b. At iteration t, the local ciphertext gradient of Pi can be written as

where [[∇i,k^t]] is the ciphertext gradient of the local loss function Fi(wi(t)) with respect to the k-th element in wi(t) = [wi,1(t), wi,2(t), …, wi,k(t), …]. Then [[∇i,k^t]] can be derived as

where n is the parameter of the key[15], and [[∇ij,k^t]] is the ciphertext gradient corresponding to sample xij,k, which can be obtained by

Based on the above theory and Eq.(4), the element [[wi,k(t)]] in the local ciphertext model of Pi at iteration t can be expressed as

Using Eqs(8), (7) and (9), the participants can obtain [[wi(t)]]. After τ local updates, each participant encapsulates its local ciphertext model into a transaction ELWPi = (t, [[wi(t)]]) and broadcasts it to the other nodes.
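To show how a participant can apply a gradient step without ever decrypting, the sketch below uses the identity [[w − η∇]] = [[w]]·[[∇]]^(−η) with a fixed-point integer encoding (toy Paillier parameters; the scales S and SETA and all names are illustrative assumptions, not from the paper):

```python
import random
from math import gcd

def keygen(p=10007, q=10009):
    # toy Paillier key; real deployments use 2048-bit-plus primes
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    mu = pow(lam % n, -1, n)
    return n, lam, mu

def enc(n, m):
    n2 = n * n
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m % n, n2) * pow(r, n, n2)) % n2

def dec(n, lam, mu, c):
    m = ((pow(c, lam, n * n) - 1) // n) * mu % n
    return m if m <= n // 2 else m - n             # signed decoding

n, lam, mu = keygen()
S, SETA = 100, 10            # fixed-point scales for values and learning rate
w, g, eta = 0.50, 0.30, 0.1  # current weight, gradient, learning rate
cw = enc(n, round(w * S * SETA))                   # align scales up front
cg = enc(n, round(g * S))
# [[w - eta*g]] = [[w]] * [[g]]^(-eta): a negative exponent with a modulus
# is computed via the modular inverse (Python 3.8+)
c_new = cw * pow(cg, -round(eta * SETA), n * n) % (n * n)
assert dec(n, lam, mu, c_new) == round((w - eta * g) * S * SETA)
```

The decrypted result is the scaled plaintext update (w − η·g)·S·SETA, so the computing party recovers the same step a plaintext learner would take.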

(3) Global aggregation. The computing party C performs a global aggregation after every τ local updates. Specifically, the computing party collects the N transactions ELWP1, …, ELWPN from the participants to obtain the local ciphertext models [[wi(t)]] at iteration t, and updates the global model in the encrypted domain as

Then, consensus based on the PBFT protocol is performed. C acts as the primary node and is responsible for packing the transactions containing the global ciphertext model and the received local ciphertext models into the new block Bg

where g = 1, 2, …, G; the number of blocks equals the number of global aggregations. As replicas in the consensus, the Pi perform verification, including the signature of each transaction, the MAC, and the correctness of Eq.(10) according to the public key.

    2 Joint optimization for computation resources and adaptive aggregation

Distributed learning, especially computation and learning in the encrypted domain, consumes a large amount of computing and energy resources. For the computing party C to profit from its data analysis, it is necessary to keep operating costs low. Therefore, how to effectively use the given resources to optimize the learning effect (minimize the loss function) becomes an important issue. To solve this problem, the computation cost and energy consumption at each node are analyzed, based on which the convergence performance of the model is improved by optimizing the computation resources allocation and the adaptive aggregation (dynamically adjusting the aggregation frequency τ).

    2.1 Resources consumption and analysis

Resource consumption is analyzed from the aspects of learning (including local learning and global learning) and blockchain consensus, respectively. The computing resources (CPU cycle frequencies) used by node i (i ∈ I) for learning and for consensus are assumed to be given. (1) In learning (subsection 1.3), it is assumed that the average number of CPU cycles consumed by the participants and the computing party to complete one step of ciphertext calculation is μ1, and the average number consumed to complete one step of plaintext calculation is μ2.

Local update. Pi' (i' ∈ I, i' = 1, …, N) updates the local ciphertext model as shown in Eq.(9) and broadcasts it to the blockchain in the form of an ELWPi' transaction. In this step, the computation complexity is O(|w||Di'|). Accordingly, the computation cost Δ and computation time ε are

Global aggregation. The computing party C (C = i″ ∈ I, i″ = N + 1) collects ELWPi' from Pi' and updates the global ciphertext model as shown in Eq.(10). In this step, the computation complexity is O(N|w|). Since Paillier cannot handle ciphertext multiplication, the computing party needs to decrypt the ELWPi' transactions and compute the optimization parameters φ, δ (see subsection 2.2 for details), whose computation complexity is O(N|w||Di'|). The computation cost Δ and computation time ε are

(2) In the PBFT consensus run between the participants and the computing party, at most F = (N − 1)/3 faulty nodes can be tolerated. It is assumed that the average number of CPU cycles consumed by computing tasks is α, and that generating or verifying a signature and generating or verifying a MAC require β and θ CPU cycles, respectively. The consensus protocol consists of five steps.

Step 1 Request. Participants submit K transactions containing model parameters to the computing party C (primary node i″). The primary node i″ packs the verified transactions into a new block. Afterwards, it broadcasts the produced block and the pre-prepare messages to the other nodes for verification. The computation cost Δ and computation time ε are

Step 2 Pre-prepare. Pi' (i' ≠ i″), as a replica, receives a new block with the pre-prepare message, then verifies the signature and MAC of the block and the signature and MAC of each transaction in turn. Finally, the computation result is verified according to the smart contract. If the result is verified successfully, Pi' sends prepare messages to all the other nodes. The computation cost Δ and computation time ε are

Step 3 Prepare. All the nodes receive and check the prepare messages to ensure that they are consistent with the pre-prepare message. Once a node receives 2F prepare messages from the others, it sends commit messages to all the other nodes. The computation cost and computation time are

Step 4 Commit. All the nodes receive and check the commit messages to ensure that they are consistent with the prepare message. Once a node receives 2F commit messages from the others, it sends a reply message to the primary node. The computation cost and computation time are

Step 5 Reply. The primary node receives and checks the reply messages. After it receives 2F reply messages, the new block takes effect and is added to the blockchain. The computation cost Δ and computation time are
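The cost bookkeeping of the five steps can be made concrete with a small helper (hypothetical, not from the paper) that computes the fault tolerance F and the total number of messages exchanged in one PBFT round:

```python
def pbft_round(n_nodes):
    """Fault tolerance and message count of one PBFT round.

    n_nodes = N + 1 (N participants plus the computing party);
    a node advances a phase after collecting 2F matching messages.
    """
    f = (n_nodes - 1) // 3                       # tolerated faulty replicas
    pre_prepare = n_nodes - 1                    # primary -> each replica
    prepare = (n_nodes - 1) ** 2                 # each replica -> all others
    commit = n_nodes * (n_nodes - 1)             # every node -> all others
    reply = n_nodes - 1                          # each replica -> primary
    return f, pre_prepare + prepare + commit + reply

f, msgs = pbft_round(6)                          # N = 5 participants
```

The quadratic prepare and commit phases dominate, which is why the per-node signature and MAC costs β and θ enter the energy model multiplied by the number of peers.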

As analyzed above, the efficiency (computation time) of a node in each step is constrained by the computation cost and computation resources. Following the model in Ref.[14], the energy consumption of a local update can be given as

and the energy consumption of a global aggregation can be given as

    2.2 Optimization model and solutions

For the loss function F(w), linear regression satisfies the following assumptions[16]. Fi(w) and F(w) are convex and φ-smooth, i.e., for any w, w', ‖∇Fi(w) − ∇Fi(w')‖ ≤ φ‖w − w'‖. In addition, ‖∇Fi(w) − ∇F(w)‖ ≤ δi captures the divergence between the gradient of a local loss function and the gradient of the global loss function. This divergence depends on how the data are distributed across the nodes, and the measure factors φ, δ affect the optimization model. For any t, the weighted average over |Di|/|D| is used to approximate these factors along the learning, where

    Therefore,the optimization model is defined as

C5: τ ≤ τmax

where the loss function is minimized under the constraints C1-C5. The total system energy E is limited by C1, and the computation resources of the nodes are limited by C2. Moreover, the total times of the learning process and the consensus process are limited by C3 and C4, respectively, where ξ makes the times of the two processes consistent. Since τ is unbounded and the problem is hard to solve directly, the search space is restricted by τmax in C5 and to integers.

Since the denominator of Eq.(25) is always positive, combining Eq.(21) and C1, the parameter T can be replaced by the optimal value T = Eτ(τFJ(f) + FQ(f))^(-1). Note that φ and δ in Eq.(25) are ideal values, and only the approximations φ̂ and δ̂ can be updated along the learning. In each global aggregation, the computation resources allocated to each node in each step and the aggregation frequency τ are optimized by convex optimization theory. The joint optimization is repeated at every global aggregation until the energy of the system is used up, as shown in Algorithm 1.

Algorithm 1 Process of learning and optimization
Input: E, η, B0, ξ
Output: [[w(T)]]
1. Initialize t = 0, τ = 1, R = 0 // R is the energy count
2. Initialize E' = E and use Pk to obtain [[w(0) = 0]]
3. Initialize f, evenly distributed to each process
4. while (t ≥ 0 & E > 0) do
5.   for i = 1, 2, 3, …, N do
6.     Pi downloads the global parameter from the blockchain
7.     Pi completes τ local updates
8.     Pi broadcasts ELWPi
9.   end for
10.  t = t + τ
11.  C receives [[wi(t)]] from Pi
12.  C aggregates [[w(t)]] and calculates δ̂, φ̂
13.  C calculates R = R + τFJ(f) + FQ(f)
14.  C updates E = E' − R
15.  C uses Eq.(25) to obtain the optimal f and τ, where τ ∈ [1, τmax]
16.  if R > E' then
17.    reduce τ to the maximum value within E'
18.    set E = 0
19.  end if
20.  C creates a block and uploads it to the blockchain
21. end while
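Stripping away the cryptography and consensus, the control flow of Algorithm 1 can be sketched as an energy-budgeted training loop; the energy terms FJ, FQ, the doubling rule standing in for the Eq.(25) solve, and all constants below are illustrative placeholders:

```python
import numpy as np

def run_adaptive(E, eta=0.1, tau_max=40, FJ=2.0, FQ=5.0, seed=0):
    """Spend energy budget E over rounds of tau local steps + 1 aggregation."""
    rng = np.random.default_rng(seed)
    w_true = np.array([2.0, -1.0])
    shards = []
    for _ in range(3):                       # N = 3 participants
        X = rng.normal(size=(30, 2))
        shards.append((X, X @ w_true))

    w, spent, tau = np.zeros(2), 0.0, 1
    while spent + tau * FJ + FQ <= E:        # stop before exceeding the budget
        local = []
        for X, y in shards:                  # tau local GD steps per participant
            wi = w.copy()
            for _ in range(tau):
                wi -= eta * X.T @ (X @ wi - y) / len(y)
            local.append(wi)
        w = np.average(local, axis=0, weights=[len(y) for _, y in shards])
        spent += tau * FJ + FQ               # energy count R, Algorithm 1 line 13
        tau = min(tau * 2, tau_max)          # placeholder for the Eq.(25) solve
    return w, spent

w, spent = run_adaptive(E=400.0)
```

Raising τ spreads the fixed aggregation cost FQ over more local steps, which is exactly the trade-off the real optimization in Eq.(25) balances against the staleness of the local models.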

    3 Simulation and analysis

    3.1 Security and privacy analysis

    The system addresses data privacy and model security by combining distributed machine learning with homomorphic encryption and blockchain.

For privacy, the Paillier encryption algorithm is used to protect the model parameters: the encrypted parameters prevent attackers from obtaining local data by eavesdropping on the broadcasts and also hide every model parameter from the other participants. Only the computing party, as the sponsor, can know the parameters.

For security, blockchain is used for the peer-to-peer interaction between the computing party and the participants to ensure the reliability of the system. Through the PBFT consensus, the participants are able to verify the transactions and the updated ciphertext models from the beginning to the end results, which confirms the contribution of each participant.

    3.2 Simulation

    In this section,the proposed joint optimization scheme for computation resources and adaptive aggregation is simulated.To verify its performance,the proposed joint optimization is compared with the other two optimization cases using the loss function value as a metric.

Only-aggregation. Ref.[16] proposed an adaptive aggregation algorithm. Assuming that the computation resources are fixed (the computation resources allocated to each node in different steps are equal and fixed), only the aggregation frequency is adaptively optimized so that the loss function reaches its minimum.

Only-resources. A comparison scheme for the joint optimization, where the aggregation frequency is fixed and only the allocation of computation resources is optimized.

To demonstrate the generality of the joint optimization, it is validated on the Boston house price dataset with 14 features and the fish toxicity dataset with 7 features. 506 samples are selected from each dataset, and all samples are preprocessed into the interval [−1, 1]; 80% of them are used for training and the rest for testing. The simulation parameter settings are shown in Table 1.

    Table 1 The simulation parameters

Fig.2 and Fig.3 show the relationship between the loss value and the aggregation frequency τ. To find the optimal fixed τ for the only-resources optimization through simulations, values of τ from 10 to 90 are considered. It can be seen that the optimal value of τ varies with the number of participants and the dataset, so a fixed value of τ is not suitable for all cases. Note that this optimal value cannot be obtained in practical work. To facilitate comparison, according to the changing trend of the curves, τ = 25 is set for the Boston dataset and τ = 30 for the fish dataset.

Fig.2 Loss value with different τ in the Boston dataset

Fig.3 Loss value with different τ in the fish dataset

Fig.4 shows the relationship between system energy and loss value when the number of participants is N = 5. Since the Boston dataset has more feature dimensions than the fish dataset, it consumes more energy during learning and requires more energy to converge (i.e., a different order of magnitude on the horizontal axis). It can be seen that as the system energy increases, the loss values of all three cases decrease, and the joint optimization scheme has the smallest loss value. It adjusts the computation resources allocation of each node in each step according to the complexity, so that the energy consumption of each iteration reaches an ideal value and the distributed learning process can complete more iterations under the constraint of limited system energy. When the system energy is small, the advantage of the joint optimization scheme is more obvious. In addition, there is only a small difference between the loss values of the joint optimization and the only-resources optimization. It can be seen that in distributed machine learning, optimizing the computation resources allocation plays a more important role than optimizing the aggregation frequency.

    Fig.4 Loss value with different energy consumption

    Fig.5 shows the relationship between the number of participants and the loss value.It can be seen that as the number of participants increases,the change trend of the loss value under different datasets is different.The reason is that the loss value is not only affected by the number of participants,but also related to the data distribution.With the same number of participants,the proposed joint optimization scheme has the smallest loss value.

    4 Conclusions

A distributed learning scheme is proposed using homomorphic encryption and blockchain as the privacy and security guarantees. In the scheme, the data are left with their owners and only the encrypted model parameters derived from the data are transmitted for global aggregation, all of which is recorded and verified by the blockchain consensus. In this way, the privacy of both the data and the model, as well as the security of the learning, are ensured. Most importantly, the computation resources and the adaptive aggregation in the distributed learning and consensus are jointly optimized. Simulation results show the efficiency of the scheme.
