
    A Visual-Based Gesture Prediction Framework Applied in Social Robots

    2022-01-26 00:36:00  Bixiao Wu, Junpei Zhong, and Chenguang Yang
    IEEE/CAA Journal of Automatica Sinica, 2022, Issue 3

    Bixiao Wu, Junpei Zhong, and Chenguang Yang

    Abstract—In daily life, people use their hands in various ways for most activities. Many applications are based on the position, direction, and joints of the hand, including gesture recognition, gesture prediction, and robotics. This paper proposes a gesture prediction system that uses hand joint coordinate features collected by the Leap Motion to predict dynamic hand gestures. The model is applied to the NAO robot to verify the effectiveness of the proposed method. First, in order to reduce the jitter or jump generated in the process of data acquisition by the Leap Motion, a Kalman filter is applied to the original data. Then, several new feature descriptors are introduced: the length feature, angle feature, and angular velocity feature are extracted from the filtered data. These features are fed into a long short-term memory recurrent neural network (LSTM-RNN) in different combinations. Experimental results show that the combination of coordinate, length, and angle features achieves the highest accuracy of 99.31%, and the method can also run in real time. Finally, the trained model is applied to the NAO robot to play the finger-guessing game. Based on the predicted gesture, the NAO robot can respond in advance.

    I. INTRODUCTION

    CURRENTLY, computers are becoming increasingly popular, and the demand for human-robot interaction is growing. More attention is being paid to research on new technologies and methods for human-robot interaction [1]–[3]. Making human-robot interaction as natural as daily human-human interaction is the ultimate goal. Gestures have long been considered an interactive technology that can provide computers with more natural, creative, and intuitive input. Gestures have different meanings in different disciplines. In terms of interaction design, the difference between using gestures and using a mouse and keyboard is obvious: gestures are more acceptable to people. Gestures are comfortable and less limited by interactive devices, and they can convey more information. Compared with traditional keyboard and mouse control, controlling a computer directly by hand movement has the advantage of being natural and intuitive.

    Gesture recognition [4] refers to the process of recognizing the representation of dynamic or static gestures and translating them into meaningful instructions. It is an extremely significant research direction in human-robot interaction technology. Methods for gesture recognition can be divided into two types: visual-based [5], [6] and non-visual-based. The study of non-vision approaches began in the 1970s. Non-vision methods usually take advantage of wearable devices [7] to track or estimate the orientation and position of the fingers and hands. Gloves are very common devices in this field, and they contain sensory modules with a wired interface. The advantage of gloves is that their data do not need to be preprocessed. Nevertheless, they are very expensive for virtual reality applications, and their wires make them uncomfortable to wear. With the development of technology, current research on non-visual gesture recognition mainly focuses on EMG signals [8]–[11]. However, EMG signals are greatly affected by noise, which makes them difficult to process.

    Vision-based gesture recognition is less intrusive and contributes to a more natural interaction. It refers to the use of cameras [12]–[16], such as the Kinect [17], [18] and the Leap Motion [19], [20], to capture images of gestures. Algorithms are then used to analyze and process the acquired data to obtain gesture information, so that the gesture can be recognized. Because it is more natural and easy to use, it has become the mainstream approach to gesture recognition. However, it is also a very challenging problem.

    By using the results of gesture recognition, the subsequent gesture of a performer can be predicted. This process is called gesture prediction, and it has wider applications. In recent years, with the advent of deep learning, many deep neural networks (DNNs) have been applied to gesture prediction. Zhang et al. [21] used an RNN model to predict gestures from raw sEMG signals. Wei et al. [22] combined a 3D convolutional residual network and a bidirectional LSTM network to recognize dynamic gestures. Kumar et al. [23] proposed a multimodal framework based on hand features captured from Kinect and Leap Motion sensors to recognize gestures, using a hidden Markov model (HMM) and a bidirectional long short-term memory (LSTM) model. The LSTM [24] has become an effective model for solving learning problems related to sequence data. Hence, inspired by these previous works, we adopt the LSTM to predict gestures in our proposed framework.

    Fig. 1. Pipeline of the proposed approach.

    In gesture prediction, hand key point detection is one of the most important steps. In the early stage of technological development, researchers mainly used color filters to segment the hands to achieve detection. However, this type of method relies on skin color, and detection performance is poor when the hand is in a complex scene. Therefore, researchers proposed detection methods based on 3D hand key points. The goal of 3D hand key point estimation is to locate the 3D coordinates of hand joints in a depth image frame, and it is mostly used in immersive virtual games, interactive tasks [25], [26], and so on. The Leap Motion is a vision-based device for 3D data extraction. It can extract the positions of the hand joints, their orientation, and the speed of the fingertip movements. Recently, the Leap Motion has often been used by researchers for gesture recognition and prediction. For example, some scholars use it to recognize American sign language (ASL) [27], [28] with high recognition accuracy. Moreover, Zeng et al. [29] proposed a gesture recognition method based on deterministic learning and joint calibration of the Leap Motion, and Marin et al. [30] developed a method that combines the Leap Motion and Kinect to calculate different hand features, obtaining higher accuracy. In this work, we use the hand key point data detected by the Leap Motion to predict gestures and utilize the recognition results to play the finger-guessing game. This game contains three gestures: rock, paper, and scissors. The winning rules are: scissors beats paper, paper beats rock, and rock beats scissors. Based on these rules, this paper proposes a method to judge gestures in advance, before the player has completed the action.
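
    Because these rules form a simple cyclic mapping, the robot's counter-move can be computed with a one-line lookup once the player's gesture has been predicted. The sketch below is illustrative only (the paper does not publish code); the gesture labels and function name are ours.

```python
# Minimal sketch of the finger-guessing (rock-paper-scissors) rule:
# given the predicted human gesture, return the gesture that beats it.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def winning_response(predicted_gesture: str) -> str:
    """Return the robot gesture that beats the predicted human gesture."""
    return BEATS[predicted_gesture]

if __name__ == "__main__":
    print(winning_response("rock"))  # -> "paper", chosen before the human finishes
```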

    The combination of the Leap Motion and the LSTM significantly improves human-robot interaction. The Leap Motion can track each joint of the hand directly, which supports recognizing or predicting gestures. Moreover, compared with other devices, the Leap Motion has higher localization precision. On the other hand, the LSTM solves prediction problems well in most cases, and it is one of the important algorithms of deep learning (DL). This work combines the strengths of the LSTM and the Leap Motion to predict gestures. The Leap Motion captures 21 three-dimensional joint coordinates in each frame, and the LSTM network is used to train and test these features. This work has the following novel contributions:

    1) A method for predicting gestures based on the LSTM is proposed. The gesture data are collected by the Leap Motion.

    2) In order to reduce or eliminate the jitter or jump generated in the process of acquiring data with the Leap Motion, a Kalman filter is applied, which solves this problem effectively.

    3) We propose a reliable feature extraction method, which extracts coordinate features, length features, angle features, and angular velocity features, and combines these features to predict gestures.

    4) We apply the trained model to the NAO robot and make it play the finger-guessing game with players, which effectively verifies the real-time performance and accuracy of the proposed approach.

    The rest of this paper is structured as follows: in Section II, the data processing procedure is given. In Section III, the experiments of this work are introduced in detail and the effectiveness of the method is verified. Finally, Section IV provides a summary. The framework of this paper is shown in Fig. 1.

    II. DATA PROCESSING

    A. Leap Motion Controller

    The structure of the Leap Motion is not complicated, as shown in Fig. 2. The main part of the device includes two cameras and three infrared LEDs. They track infrared light outside the visible spectrum, at a wavelength of 850 nanometers. Compared with other depth cameras, such as the Kinect, the information obtained from the Leap Motion is limited (only a few key points rather than complete depth information) and it works in a smaller three-dimensional area. However, the data acquired by the Leap Motion are more accurate. Moreover, the Leap Motion provides software that can recognize some movement patterns, including swipe, tap, and so on. Developers can access some functions of the Leap Motion through its application programming interface (API) to create new applications. For example, they can obtain information about the position and length of the user's hand to recognize different gestures.

    Fig. 2. The structure of the Leap Motion.

    Even though the manufacturer declares that the position-measurement accuracy of the Leap Motion is around 0.01 mm, [31] shows that it is in fact about 0.2 mm for static measurements and 0.4 mm for dynamic measurements. Moreover, the finger joint coordinates extracted by the Leap Motion exhibit jitter or even jumps, which can affect the accuracy of the experimental results. In order to reduce or eliminate these phenomena, this work takes advantage of the Kalman filter to correct the predicted positions of the hand joints.

    B. Data Acquisition

    Each finger is marked with a name: thumb, index, middle, ring, and pinky, and each (except the thumb) includes four bones. As shown in Fig. 3, the phalanges of a finger include the metacarpal, proximal phalanx, intermediate phalanx, and distal phalanx. The thumb has only three phalanges, one fewer than the other fingers. In the algorithm design, we set the length of the thumb metacarpal bone to 0 to guarantee that all five fingers have the same number of phalanges, which simplifies programming. In this work, the main data acquired by the Leap Motion are as follows:

    1) Number of Detected Fingers: Num ∈ [1, 5] is the number of fingers detected by the Leap Motion.

    2) Position of the Finger Joints: P_i, i = 1, 2, ..., 20, contains the three-dimensional position of each finger joint. The Leap Motion provides a one-to-one map between coordinates and finger joints.

    3) Palm Center: P_c(x_0, y_0, z_0) represents the three-dimensional coordinates of the center of the palm area in 3D space.

    4) Fingertip Movement Speed: V represents the speed of each fingertip detected by the Leap Motion along the three coordinate directions. A per-frame record combining these quantities is sketched below.
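
    For clarity, the per-frame quantities listed above can be grouped into a single record. The following sketch is an assumed data structure for illustration; it is not the Leap Motion SDK, and the field names are ours.

```python
# Assumed per-frame record built from the Leap Motion data listed above:
# number of detected fingers, 20 joint positions, palm center, fingertip speeds.
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HandFrame:
    num_fingers: int                # Num in [1, 5]
    joints: List[Vec3]              # P_i, i = 1..20, one 3-D point per joint
    palm_center: Vec3               # P_c = (x0, y0, z0)
    fingertip_speeds: List[Vec3]    # V, one 3-D velocity per fingertip

def validate(frame: HandFrame) -> None:
    """Basic sanity checks on a captured frame."""
    assert 1 <= frame.num_fingers <= 5
    assert len(frame.joints) == 20
    assert len(frame.fingertip_speeds) == 5
```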

    Fig. 3. Definition of the endoskeleton in the Leap Motion.

    C. Kalman Filter

    1) Problem Formulation: In the process of gesture changes, the fingertips have the largest range of motion and jitter or jump more easily than the other joints; therefore, the Kalman filter is used to process the fingertip data. Compared with other filters, such as the particle filter or the Luenberger observer, the Kalman filter has sufficient accuracy and can effectively remove Gaussian noise. In addition, its low computational complexity meets the real-time requirements of this work. Therefore, the Kalman filter is used in this work.

    Suppose that the current fingertip position obtained by the Leap Motion is P_t and the speed is V_t. The Kalman filter assumes that these two variables obey a Gaussian distribution, each with mean μ and variance σ². For clarity, X_t denotes the best estimate at time t, and Y_t denotes the covariance matrix. The equations of X_t and Y_t are as follows:

    2) The Prediction Process: We need to predict the current state (time t) according to the state of the last time step (time t-1). This prediction process can be described as follows:

    where Δt is the time interval, which depends on the data acquisition rate of the Leap Motion, and α is the rate of speed change.

    The matrix F_t is used to represent the prediction matrix, so (5) can be rewritten as follows:

    and through the basic operations of covariance, Y_t can be expressed as the following equation:

    3) Refining the Estimate With Measurements: From the measured sensor data, the current state of the system can be roughly estimated. However, due to uncertainty, some states may be closer to the real state than the measurements acquired directly from the Leap Motion. In this work, the covariance R_t is used to express the uncertainty (such as the sensor noise), and the mean value of the distribution is denoted Z_t.

    Now, there are two Gaussian distributions, one near the predicted value and the other near the measured value. Therefore, the two Gaussian distributions are multiplied to calculate the optimal estimate between the predicted value and the value measured by the Leap Motion, as shown in the following equations:

    where μ_0, σ_0 represent the mean and variance of the predicted values, μ_1, σ_1 represent the mean and variance of the measured values, and μ′, σ′ represent the mean and variance of the calculated values, respectively.

    By substituting (8) into (9), we can get the following equations:

    The same parts of (10) and (11) are represented by k.

    Therefore, (10) and (11) can be converted as follows:

    4) Integrate All the Equations: In this section, the equations of the Kalman filter used in this paper are integrated.

    There are two Gaussian distributions: the prediction part

    and the measurement part.

    We then put them into (13) and (14) to get the following equation:

    According to (12), the Kalman gain is as follows:

    Next, the above three formulas are simplified. On both sides of (17) and (18), we left-multiply by the inverse matrix of F_t; on both sides of (18), we also right-multiply by the inverse of its transpose. We then get the following simplified equation:

    The result is the new optimal estimate of the data collected by the Leap Motion, which, together with the updated covariance, is fed into the next prediction and update step, iterating continuously. Through the above steps, the data collected by the Leap Motion become more accurate.
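
    To make the filtering step concrete, the sketch below implements a per-axis constant-velocity Kalman filter for one fingertip coordinate. It follows the prediction/update cycle described above, but the noise covariances, the sampling interval, and the matrix values are placeholder assumptions, since the paper does not list its exact F_t, Q, and R_t.

```python
# Per-axis Kalman filter sketch for smoothing a fingertip coordinate stream.
# State x = [position, velocity]; dt, q, r are assumed values, not the paper's.
import numpy as np

def kalman_smooth(measurements, dt=1.0 / 60.0, q=1e-3, r=0.16):
    F = np.array([[1.0, dt], [0.0, 1.0]])     # prediction matrix F_t
    H = np.array([[1.0, 0.0]])                # only position is measured
    Q = q * np.eye(2)                         # process noise covariance
    R = np.array([[r]])                       # measurement noise R_t (~0.4 mm std)
    x = np.array([[measurements[0]], [0.0]])  # best estimate X_t
    P = np.eye(2)                             # estimate covariance Y_t
    smoothed = []
    for z in measurements:
        # Prediction: propagate the state and its covariance.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update: the Kalman gain blends the prediction with measurement Z_t.
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        smoothed.append(float(x[0, 0]))
    return smoothed
```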

    D. Feature Extraction

    After filtering the original data, we derive four features from the filtered data. These features are introduced in the rest of this section.

    ● Coordinate: the x, y, and z coordinates of the hand joints obtained by the Leap Motion.

    ● Length: the distance from the fingertips to the center of the hand.

    ● Angle: the angle between the proximal phalanx and the intermediate phalanx of each finger (except the thumb).

    ● Angular velocity: the rate of change of the joint angles.

    1) Coordinate Feature: As shown in Fig. 4(a), this feature set represents the positions of the finger joints in three-dimensional space. The original data take the Leap Motion as the coordinate origin, as shown in Fig. 5. As the hand moves, the obtained data can change considerably, which has a certain impact on the experimental results. For the purpose of eliminating the influence of different coordinate systems, the coordinate origin is changed to the palm center, as shown in Fig. 4(a). Taking the palm of the hand as the reference plane, the direction from the palm center to the root of the middle finger is the positive direction of the y-axis. The positive direction of the x-axis is perpendicular to the y-axis and points to the right. The z-axis passes through the coordinate origin and is perpendicular to this plane.

    The positive direction of the y-axis in the new coordinate system can be represented by the following vector:

    Similarly, the positive direction of the x-axis in the new coordinate system can be expressed by the following vector:

    And the positive direction of the z-axis in the new coordinate system can be expressed by the following vector:

    The coordinate representation in the new coordinate system is

    Fig. 4. Four different types of features extracted from the Leap Motion.

    Fig. 5. The coordinate system of the Leap Motion.

    where (x_i′, y_i′, z_i′) represents the new coordinate after the coordinate conversion, and i = 1, 2, ..., 20 indexes the points corresponding to the finger joints. Through the above equations, we can get new coordinates with the palm center as the origin of the coordinate system. Because each three-dimensional coordinate is an array of length 3, the actual dimension of the coordinate feature is 3 × 20 = 60.
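
    A sketch of this coordinate conversion is given below: it builds an orthonormal basis at the palm center (y toward the middle-finger root, z along an assumed palm-normal vector, x completing the right-handed frame) and re-expresses the 20 joints in that frame, yielding the 60-D feature. The paper's exact basis equations are not reproduced here; the palm normal is an assumed input.

```python
# Sketch: re-express the 20 joint positions in a palm-centered frame.
import numpy as np

def to_palm_frame(joints, palm_center, middle_root, palm_normal):
    joints = np.asarray(joints, dtype=float)            # shape (20, 3)
    palm_center = np.asarray(palm_center, dtype=float)
    y_axis = np.asarray(middle_root, dtype=float) - palm_center
    y_axis /= np.linalg.norm(y_axis)                    # toward middle-finger root
    z_axis = np.asarray(palm_normal, dtype=float)
    z_axis /= np.linalg.norm(z_axis)                    # perpendicular to palm plane
    x_axis = np.cross(y_axis, z_axis)                   # completes the frame
    x_axis /= np.linalg.norm(x_axis)
    basis = np.stack([x_axis, y_axis, z_axis])          # rows: new x, y, z axes
    local = (joints - palm_center) @ basis.T            # shape (20, 3)
    return local.reshape(-1)                            # 60-D coordinate feature
```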

    2) Length Feature: As shown in Fig. 4(b), this feature refers to the length from each fingertip to the center of the palm. The joint coordinates collected from the Leap Motion are used to calculate the length information. Since the fingertips are the most variable joints, (31) is used to calculate the distance between the palm center and the fingertips.

    where i = 4, 8, 12, 16, 20 indexes the points corresponding to the fingertips in Fig. 4(b), and the dimension of the length feature is 5.

    3) Angle Feature: As shown in Fig. 4(c), this feature represents the angle between the proximal phalanx and the intermediate phalanx of each finger (except the thumb); the angle extracted from the thumb is between the intermediate phalanx and the distal phalanx. The calculation process is as follows:

    4) Angular Velocity Feature: As shown in Fig. 4(d), this feature represents the rate of change of the joint angles, as shown in the following equation:

    where t is the current time and Δt is the time interval, which depends on the Leap Motion's sampling time. The dimension of the angular velocity feature is 5.
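
    The three low-dimensional features can be computed directly from the palm center and the joint positions, as in the sketch below. The fingertip indices and the per-finger joint layout are assumptions made for illustration (the paper numbers the fingertips 4, 8, 12, 16, 20).

```python
# Sketch of the 5-D length, 5-D angle, and 5-D angular velocity features.
import numpy as np

FINGERTIPS = [3, 7, 11, 15, 19]              # assumed 0-based fingertip indices

def length_feature(joints, palm_center):
    tips = np.asarray(joints, dtype=float)[FINGERTIPS]
    return np.linalg.norm(tips - np.asarray(palm_center, dtype=float), axis=1)

def angle_feature(joints):
    joints = np.asarray(joints, dtype=float)
    angles = []
    for base in range(0, 20, 4):                     # assumed: 4 joints per finger
        v1 = joints[base + 1] - joints[base]         # proximal phalanx vector
        v2 = joints[base + 2] - joints[base + 1]     # intermediate phalanx vector
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(angles)

def angular_velocity_feature(angles_t, angles_prev, dt):
    return (np.asarray(angles_t) - np.asarray(angles_prev)) / dt
```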

    E. Gesture Prediction

    The method in the previous section produces four different features, and each feature represents some information related to the performed gesture. In this section, the LSTM network [24] used for gesture prediction is described in detail. The internal structure of the LSTM is shown in Fig. 6, where x_t denotes the input of the LSTM network and h_t denotes its output. f_t, i_t, and o_t are the forget gate, input gate, and output gate variables, respectively. The subscripts t and t-1 represent the current time and the previous time. c_t and c̃_t are the memory cell state and the memory gate, respectively. The notations σ_lstm and tanh denote the sigmoid and hyperbolic tangent activation functions, as shown in (36) and (37).

    The relevant parameters of the LSTM can be calculated by the following equations:

    Fig. 6. The internal structure of the LSTM.

    Fig. 7. Collecting data of the gesture paper by using the Leap Motion.

    where the subscripts f, i, o, and c are related to the parameters of the forget gate, input gate, output gate, and memory cell, respectively. The parameters W_f, W_i, W_c, and W_o denote the weight matrices of the corresponding subscripts. Similarly, b_f, b_i, b_c, and b_o represent the biases corresponding to the subscripts of the LSTM network. The notation * denotes the element-wise product between vectors.

    In the process of data collection, the change of a gesture can be divided into three stages, as shown in Fig. 7. For a clearer description, we take the process of turning rock into paper as an example to explain these three stages:

    1) The Original Stage: As shown in Figs. 7(a) and 7(b), the gestures at this stage are close to the original state, that is, the gesture is similar to rock.

    2) The Intermediate Stage: As shown in Figs. 7(c) and 7(d), the gestures at this stage change significantly compared to the original stage, that is, the five fingers clearly show different degrees of openness.

    3) The Completion Stage: As shown in Figs. 7(e) and 7(f), the gestures at this stage are close to completion, that is, the gesture tends toward paper.

    Since different players perform actions at different speeds, each action contains 2–6 frames. For the purpose of uniformity, the T of the LSTM is set to 4, that is, 4 frames of data are input into the LSTM network for prediction. This process is shown in Fig. 8. The input layer of the LSTM network consists of the features obtained by the Leap Motion. These features are the coordinate, length, angle, and angular velocity calculated in Section II-D, and their dimensions are 60, 5, 5, and 5, respectively. In addition, the hidden layer of the LSTM network contains 100 nodes. The output of the LSTM network is the result of gesture prediction, with a dimension of 3, that is, rock, scissors, and paper. With the LSTM network, we can predict the gestures accurately, and the classification results are sent to a social robot for interaction and reaction.
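
    A minimal model matching this description is sketched below in PyTorch (the paper does not state its framework): T = 4 frames of the concatenated coordinate + length + angle features (60 + 5 + 5 = 70 values per frame, the best-performing combination) pass through an LSTM with 100 hidden units, and the last output is mapped to the three classes rock, paper, and scissors.

```python
# Sketch of the gesture predictor: 4 frames x 70 features -> LSTM(100) -> 3 classes.
import torch
import torch.nn as nn

class GesturePredictor(nn.Module):
    def __init__(self, input_dim=70, hidden_dim=100, num_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                  # x: (batch, T=4, 70)
        out, _ = self.lstm(x)              # out: (batch, 4, 100)
        return self.fc(out[:, -1, :])      # logits for rock / paper / scissors

model = GesturePredictor()
dummy = torch.randn(8, 4, 70)              # a batch of 8 gesture sequences
print(model(dummy).shape)                  # torch.Size([8, 3])
```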

    Fig. 8. The process of the LSTM for predicting gestures.

    III. EXPERIMENT

    A. Experimental Setup

    In this section, the performance and efficiency of the proposed framework are tested. The experiments were carried out on a laptop with an Intel Core i5-6200U CPU. The dynamic gestures rock, paper, and scissors are collected from five different players, and each player repeats each gesture 300 times at fast, medium, and slow speeds, for a total of 4500 data samples. The experimental results of the network trained with the four features and their combinations are compared.
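
    One way to organize such recordings for training is sketched below: each recording is truncated or zero-padded to T = 4 frames of the 70-D feature vector, labelled 0/1/2 for rock/paper/scissors, and split into training and test sets. The split ratio and padding strategy are assumptions, not taken from the paper.

```python
# Sketch: shape recorded gesture sequences into (N, 4, 70) tensors and split them.
import numpy as np

def make_dataset(recordings, labels, T=4, feat_dim=70, test_ratio=0.2):
    X = np.zeros((len(recordings), T, feat_dim), dtype=np.float32)
    for n, rec in enumerate(recordings):          # rec: (frames, 70) array
        frames = np.asarray(rec, dtype=np.float32)[:T]
        X[n, :len(frames)] = frames               # zero-pad short recordings
    y = np.asarray(labels, dtype=np.int64)
    idx = np.random.permutation(len(X))
    cut = int(len(X) * (1.0 - test_ratio))
    return (X[idx[:cut]], y[idx[:cut]]), (X[idx[cut:]], y[idx[cut:]])
```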

    B. Kalman Filter

    In Section II-C, the Kalman filter was introduced in detail. In this section, it is verified by an experiment in which the measured positions are obtained directly from the Leap Motion. The Kalman filter is used to process the original coordinate data to make the processed data closer to the real values. As can be seen from Fig. 9, the processed data are much smoother.

    Fig. 9. Data processed by the Kalman filter.

    C. Experimental Result

    As described in Section II-D, we extract the three-dimensional coordinate feature, length feature, angle feature, and angular velocity feature from the filtered data and train on them. Figs. 10 and 11 show the accuracy of the features using the classification algorithm of Section II-E.

    Using the three-dimensional positions of the finger joints alone, the accuracy of gesture prediction is 97.93%. The length feature and the angle feature achieve accuracies of 95.17% and 93.79%, respectively. The angular velocity feature has lower performance, with an accuracy of 79.31%; it is affected by the speed of the player's movement, so on its own it is not sufficient for an accurate prediction.

    The combination of multiple features can enrich the input of the neural network and, in some cases, improve prediction performance. As can be seen from Fig. 11, the combination of the coordinate, length, and angle features achieves the highest accuracy of 99.31%, better than any of the three features alone. These results suggest that different features represent different attributes of the hand and contain complementary information.

    Fig. 10. The experimental results of the four features.

    Fig. 11. The experimental results of the combinations of the four features.

    We examine whether the proposed method is able to achieve real-time gesture recognition and prediction. As shown in Figs. 12 and 13, the method proposed in this work can predict the gestures of the finger-guessing game very well. For example, when the player's gesture changes from rock to paper, the proposed method can predict that the player's gesture is paper before all fingers are fully open. In addition, we also verify the prediction results of the proposed method from different angles of the Leap Motion to the hand, as shown in Fig. 14.

    D. Application

    Fig. 12. The prediction process of turning rock into paper.

    Fig. 13. The prediction process of turning rock into scissors.

    Fig. 14. Predicted results from different angles of the Leap Motion to the hand.

    In order to further prove the effectiveness of the proposed method, the trained network is applied to the humanoid robot NAO, as shown in Fig. 15. The NAO is an autonomous, programmable humanoid robot designed by Aldebaran Robotics [32]. Its height is 573.2 mm and its weight is 4.5 kg. It has two cameras, voice recognition, and voice synthesis, and it is powered by a LiPo battery. In addition, it is equipped with four microphones, two sonar emitters and receivers, two IR emitters and receivers, and three tactile sensors on the top of its head.

    In this work, we mainly use the NAO robot's left hand to play the finger-guessing game with the player. As shown in Fig. 16, the NAO robot has only three fingers, and they are linked. Therefore, we define the fully opened robot fingers as paper, the half-opened robot fingers as scissors, and the clenched fingers as rock.

    Then, the trained model is applied to the NAO robot, and the experimental results are shown in Fig. 17. The Leap Motion is used to predict gestures, and the computer sends the results to the NAO robot, so that the NAO robot can win or lose the game through some simple judgments.
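
    A sketch of this robot-side response is given below, assuming the NAOqi Python SDK is installed and the robot is reachable at a placeholder address. The NAO's three linked fingers are driven through the ALMotion "LHand" joint; the opening values chosen for paper, scissors, and rock are illustrative, not the paper's exact settings.

```python
# Sketch: send the counter-gesture to the NAO robot via the NAOqi SDK.
from naoqi import ALProxy

NAO_IP, NAO_PORT = "192.168.1.10", 9559                      # placeholder address
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}
HAND_OPENING = {"paper": 1.0, "scissors": 0.5, "rock": 0.0}  # assumed joint values

def respond(predicted_gesture):
    """Open the left hand so that the robot plays the winning gesture."""
    motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
    response = BEATS[predicted_gesture]
    motion.setAngles("LHand", HAND_OPENING[response], 0.3)   # 0.3 = fraction of max speed
    return response
```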

    Fig. 17. The experimental results with the NAO robot.

    IV. CONCLUSION

    Fig. 15. Experimental equipment and platform.

    Fig. 16. The NAO robot and the rock-paper-scissors gestures.

    In this paper, a gesture prediction framework based on the Leap Motion is proposed. In the process of data acquisition by the Leap Motion, some jumps or jitter may occur; therefore, the Kalman filter is used to solve these problems. Then, based on the original coordinate features collected by the Leap Motion, we extract three new features, namely the length feature, angle feature, and angular velocity feature. The LSTM network is used to train the model for gesture prediction. In addition, the trained model is applied to the NAO robot to verify the real-time performance and effectiveness of the proposed method.
