
    Service Robot Localization Based on Global Vision and Stereo Vision

2012-02-07 07:46:48

YU Qing-xiao (于清曉), YAN Wei-xin (閆維新), FU Zhuang (付莊), ZHAO Yan-zheng (趙言正)*

1 State Key Laboratory of Mechanical System and Vibration, Shanghai Jiaotong University, Shanghai 200240, China

2 State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001, China

    Introduction

Mobile robot localization and object pose estimation in a working environment have been central research activities in mobile robotics. A solution to mobile robot localization must address two main problems [1]: the robot must have a representation of the environment, and it must have a representation of its belief regarding its pose in that environment. Sensors are the basis for addressing both problems. Many of the sensors available to mobile robots are introduced in Ref. [1], together with their basic principles and performance limitations. Among them, ultrasonic sensors [2,3], goniometers [4], laser range finders [5,6], and charge coupled device (CCD) cameras [7,8] are commonly applied in mobile robot localization projects to gather high-precision information for pose estimation.

Most mobile robot localization approaches can be broadly classified into three major types: metric, topological, and hybrid. Metric approaches [9,10] are useful when the robot needs to know its position accurately in metric coordinates. The state of the robot can also be represented more qualitatively, using a topological map [11,12]. However, these methods face many difficulties when the environment changes. A vision-based localization and mapping algorithm using the scale-invariant feature transform (SIFT) was applied to mobile robot localization and map building in Ref. [13]. Miro et al. [14] used a binocular vision system to generate a disparity map by feature matching and realized self-localization and self-navigation. Grzyb et al. [15] adopted stereo images and biologically inspired algorithms to accurately estimate the pose, size, and shape of a target object. Greggio et al. [16] applied stereo cameras and a real-time tracking algorithm to realize pattern recognition and evaluate the 3D position of a generic spatial object. Chinellato et al. [17] proposed a visual analysis, based on a computational model of distance and orientation estimation inspired by human visual mechanisms, to extract the features and pose of the object.

Considering positioning cost and computational efficiency, this paper proposes a vision-based hierarchical localization method that provides the robot's absolute coordinates. The proposed algorithm estimates real-time coordinates well and achieves positioning accuracy within ±2 cm, ensuring that the robot can successfully grab the object.

    1 Hierarchical Positioning Method

    1.1 Global vision-based coarse localization

The estimation of the mobile robot's pose is a fundamental problem, which can be roughly divided into two classes [13]: methods for keeping track of the robot's pose and methods for global pose estimation. Most of the present research has concentrated on the first class, which assumes that the initial pose of the robot is known [18]. In this study, both methods are used to position the mobile robot in the initial stage. The global vision-based localization method is used to calculate the initial position of the robot and periodically update its current coordinates, and the dead-reckoning method is used as a supplement. The global vision-based localization method is explained in detail below.

One monocular camera mounted on the ceiling acquires the global image and transmits it to the robot through the wireless network. The image is first processed to remove distortion. Then, a color segmentation method is used to locate the robot in the image and distinguish the color mark on its head in the Hue-Saturation-Value (HSV) color space. After the color mark is distinguished and extracted, its coordinates can be used to calculate the coordinates of the robot, according to the global vision model shown in Fig. 1.
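The segmentation step above can be sketched in a few lines of pure Python. The hue/saturation/value window below is an illustrative assumption (the paper does not give its thresholds), using a reddish mark as an example:

```python
import colorsys

def is_mark_pixel(r, g, b, hue_range=(0.0, 0.05), min_sat=0.5, min_val=0.3):
    """Classify one RGB pixel (floats in [0, 1]) as part of the color
    mark if it falls inside an assumed reddish HSV window."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    lo, hi = hue_range
    return lo <= h <= hi and s >= min_sat and v >= min_val

def mark_centroid(image):
    """Centroid (row, col) of all pixels classified as the mark;
    image is a list of rows of (r, g, b) tuples."""
    hits = [(i, j) for i, row in enumerate(image)
            for j, (r, g, b) in enumerate(row) if is_mark_pixel(r, g, b)]
    if not hits:
        return None
    n = len(hits)
    return (sum(i for i, _ in hits) / n, sum(j for _, j in hits) / n)
```

The centroid of the segmented mark is what feeds the global vision model of Fig. 1.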

In Fig. 1, (0, 0) denotes the origin of the camera coordinate system; H_C denotes the height of the camera relative to the floor; H_R denotes the height of the robot; (X_RC, Y_RC) is the center coordinate of the robot; (X_HC, Y_HC) is the center coordinate of the color mark.

According to the geometric relationship (similar triangles: the color mark lies H_C − H_R below the camera, while the floor lies H_C below it), the following equation can be obtained:

X_HC / X_RC = Y_HC / Y_RC = H_C / (H_C − H_R)  (1)

Then, the following results can be calculated as:

X_RC = (H_C − H_R) / H_C · X_HC,  Y_RC = (H_C − H_R) / H_C · Y_HC  (2)

    Fig.1 The global vision camera model

From Eq. (2), we know that (X_RC, Y_RC) depends on the values of H_C, H_R, and (X_HC, Y_HC). Since H_C and H_R are invariant in this study, (X_RC, Y_RC) depends only on (X_HC, Y_HC).
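This floor-plane projection is a single similar-triangles scaling; a minimal sketch, where the camera and robot heights used in the example are hypothetical (Table 1's actual values are not reproduced in this excerpt):

```python
def robot_center(x_hc, y_hc, h_c, h_r):
    """Project the color-mark coordinates (X_HC, Y_HC) onto the floor
    plane to get the robot center (X_RC, Y_RC), per Eq. (2):
    X_RC = X_HC * (H_C - H_R) / H_C, and likewise for Y."""
    if not 0 < h_r < h_c:
        raise ValueError("need 0 < robot height < camera height")
    scale = (h_c - h_r) / h_c
    return x_hc * scale, y_hc * scale

# With hypothetical heights H_C = 2.8 m and H_R = 1.2 m, the mark
# coordinates (-68.50, -311.33) from Section 2.2 map to roughly
# (-39.14, -177.90), matching the robot position reported there.
x_rc, y_rc = robot_center(-68.50, -311.33, 2.8, 1.2)
```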

Four coordinate systems are under consideration: the global coordinate system (GCS), the robot coordinate system (RCS), the sensor coordinate system (SCS), and the virtual global coordinate system (VGCS) [19]. Therefore, the appropriate transformation between VGCS and SCS should be acquired to calculate the absolute coordinates of the robot. According to the geometric relationship in Fig. 1, the homogeneous transformation matrix T between OXYZ and O′X′Y′Z′ can be obtained as:

Then, the coordinates of the robot in VGCS can be calculated as:

In this study, VGCS coincides with GCS except for a difference in scale. So, the absolute coordinates of the robot can be obtained as:

where K_height and K_width are the relative scales to the VGCS; m and n are the shift distances of VGCS relative to SCS.
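The SCS-to-GCS conversion can be sketched as a shift by the VGCS offset (m, n) followed by a per-axis rescale. The order of shift and scale, and the sign convention, are assumptions, since Eq. (5) itself is not reproduced in this excerpt:

```python
def to_global(x_s, y_s, k_width, k_height, m, n):
    """Convert SCS coordinates (pixels) to absolute GCS coordinates:
    translate the origin by the shift (m, n), then apply the
    pixel-to-centimetre scales K_width and K_height."""
    return k_width * (x_s - m), k_height * (y_s - n)
```

For instance, a point located at the VGCS origin maps to (0, 0) in GCS regardless of the scales.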

In this way, the robot can determine its initial position wherever it is placed and periodically update its current coordinates. Furthermore, real-time positioning of the robot can be realized in an indoor environment with the help of the dead-reckoning method.

    1.2 Stereo vision-based precise localization

When the service robot moves into the area where the object is placed, under the guidance of the global vision, the binocular vision-based localization method is adopted to provide the high positioning accuracy needed to successfully grab the object for the customer. In this study, color segmentation and shape-based matching [20] are separately applied to detect the object, and the extracted results are fused to obtain the accurate object region. The color segmentation method has been discussed above. Below, we describe the shape-based matching method.

The advantage of shape-based matching is its great robustness and flexibility. Instead of using gray values, features along contours are extracted and used for model generation and matching. The method is invariant to changes in illumination and variations of object gray values, and it also allows the object to be rotated and scaled. The process of shape-based matching is divided into two distinct phases: the training phase and the recognition phase. In the first phase, the template model is created and trained. In the second phase, the model is used to find and localize the object in the image. The basic concept of shape-based image matching is shown in Fig. 2.

    Fig.2 The basic concept of shape-based image matching

Using the methods mentioned above, the regions of interest (ROI) are separately obtained. Then, the following information fusion method is adopted to find the expected region of the object. Suppose that image regions A are obtained using the color segmentation method and image regions B are obtained using the shape-based matching method. The center points of image regions A and B are all extracted and compared one by one. If the difference between the center coordinates of a region in A and the center coordinates of a region in B is less than the default constant ε, the image area belongs to the expected regions of interest. Otherwise, the image area is a falsely detected region. Then, the expected regions of interest can be obtained as:

where (x_i, y_i) is the center coordinate of the i-th sub-region of image regions A, and (x_j, y_j) is the center coordinate of the j-th sub-region of image regions B.
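The fusion rule of Eq. (6) can be sketched as a pairwise comparison of region centers. Treating the "difference" as Euclidean distance and reporting the average of each matched pair are assumptions:

```python
import math

def fuse_regions(centers_a, centers_b, eps):
    """Keep only regions where a center from color segmentation (A)
    and a center from shape-based matching (B) agree to within eps
    pixels; unmatched detections are discarded as errors."""
    expected = []
    for xi, yi in centers_a:
        for xj, yj in centers_b:
            if math.hypot(xi - xj, yi - yj) < eps:
                expected.append(((xi + xj) / 2, (yi + yj) / 2))
    return expected
```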

According to Eq. (6), the expected regions can be obtained to find and match the object. After obtaining the plate region, the normalized cross-correlation (NCC) matching method [21] is adopted to obtain the depth image. Then, the 3D coordinates of the color mark on the object can be calculated using the binocular vision system. To localize the mobile robot, the homogeneous transformation matrix between the binocular coordinate system and RCS must first be obtained. Figure 3 shows the relationship between the binocular coordinate system and RCS.
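The NCC score used for the depth-image matching [21] has a standard definition; a minimal pure-Python sketch over two equal-sized gray patches:

```python
def ncc(patch_a, patch_b):
    """Normalized cross-correlation of two equal-sized patches (lists
    of rows of gray values), in [-1, 1]; 1 means a perfect match."""
    a = [v for row in patch_a for v in row]
    b = [v for row in patch_b for v in row]
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    den_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    if den_a == 0 or den_b == 0:
        return 0.0  # flat patch: correlation undefined, report 0
    return num / (den_a * den_b)
```

In practice the reference patch from the right image is slid along the epipolar line of the left image, and the disparity of the best-scoring offset yields the depth.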

    Fig.3 The diagram of binocular coordinate system and RCS

In Fig. 3, α and θ are the rotation angles of the head; O′X′Y′Z′ denotes RCS, and O″X″Y″Z″ denotes the binocular coordinate system. When α and θ are equal to zero, O″X″Y″Z″ is shifted by −6 cm in the X-axis direction relative to O′X′Y′Z′. According to this information, the homogeneous transformation matrix T between O′X′Y′Z′ and O″X″Y″Z″ can be obtained as:

Then, the coordinates of the color mark in RCS can be calculated as:

where [x′ y′ z′]^T is the location of the color mark in RCS, and [x″ y″ z″]^T is its location in the binocular coordinate system.
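Mapping a binocular-frame measurement into RCS via the head transform can be sketched as follows. Since Eq. (7) itself is not reproduced in this excerpt, which axes α (tilt) and θ (pan) rotate about is an assumption; only the −6 cm X-offset comes from the text:

```python
import math

def head_transform(alpha, theta, shift_x=-6.0):
    """4x4 homogeneous transform from the binocular frame to RCS:
    pan theta about Z, tilt alpha about X (assumed axes), plus the
    -6 cm shift along X stated in the text (units: cm, radians)."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    ct, st = math.cos(theta), math.sin(theta)
    pan = [[ct, -st, 0.0], [st, ct, 0.0], [0.0, 0.0, 1.0]]
    tilt = [[1.0, 0.0, 0.0], [0.0, ca, -sa], [0.0, sa, ca]]
    # Compose R = pan @ tilt, then append the translation column.
    rot = [[sum(pan[i][k] * tilt[k][j] for k in range(3))
            for j in range(3)] for i in range(3)]
    return [rot[0] + [shift_x], rot[1] + [0.0], rot[2] + [0.0],
            [0.0, 0.0, 0.0, 1.0]]

def to_rcs(t, point):
    """Apply the homogeneous transform t to a point [x'', y'', z'']."""
    x, y, z = point
    return [t[i][0] * x + t[i][1] * y + t[i][2] * z + t[i][3]
            for i in range(3)]
```

With α = θ = 0 the transform reduces to the pure −6 cm offset, consistent with the description of Fig. 3.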

In addition to obtaining the 3D coordinates of the color mark, the angle of the object relative to the robot should also be calculated, by extracting and filtering the edge lines of the object. The detailed process is as follows. Firstly, the region of the object is processed using the expansion algorithm and the Canny operator. Then, the extracted edge lines are filtered with a threshold filter to obtain the available edge lines that can represent the angle of the object. Next, the selected edge lines are filtered with a mean filter, whose output is used to calculate the angle of the object. Finally, the angle of the object relative to the robot, φ_object, can be reckoned as follows.

    where φ denotes the detected angle of the object in the image and θ is the rotation angle of the neck joint.
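The threshold-then-mean filtering of the edge-line angles, and the final composition with the neck rotation, can be sketched as below; the clustering tolerance and the assumed form φ_object = φ + θ are illustrative, since Eq. (9) is not reproduced in this excerpt:

```python
def object_angle(line_angles, theta, spread=5.0):
    """Estimate the object angle (degrees): keep edge-line angles
    within `spread` degrees of the median (the threshold filter),
    average the survivors (the mean filter), then add the neck
    rotation theta, assuming phi_object = phi + theta."""
    angles = sorted(line_angles)
    median = angles[len(angles) // 2]
    kept = [a for a in angles if abs(a - median) <= spread]
    phi = sum(kept) / len(kept)
    return phi + theta
```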

After determining the position of the color mark, the robot can adjust its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. The position of the color mark is periodically localized using the method above, and the pose of the robot is adjusted until precise positioning relative to the color mark is achieved. The location-based visual servo control system is shown in Fig. 4.

    Fig.4 The chart of the location-based visual servo control system
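The control loop of Fig. 4 can be sketched as a proportional law on the remaining range and heading error; the gains and the 2 cm stop threshold below are illustrative assumptions:

```python
import math

def servo_step(robot_xy, robot_heading, mark_xy,
               k_v=0.5, k_w=1.0, stop_dist=2.0):
    """One visual-servo cycle (cm, radians): given the latest mark
    estimate, return (forward_velocity, turn_rate, done). Motion
    stops once the range error is inside stop_dist (the +/-2 cm
    accuracy target from the text)."""
    dx = mark_xy[0] - robot_xy[0]
    dy = mark_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= stop_dist:
        return 0.0, 0.0, True
    err = math.atan2(dy, dx) - robot_heading
    err = math.atan2(math.sin(err), math.cos(err))  # wrap to (-pi, pi]
    return k_v * dist, k_w * err, False
```

Because each cycle re-runs the stereo localization, pose errors do not accumulate between cycles.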

    2 Experiments and Result Analysis

    2.1 Experiment setup

The experimental platform is a restaurant robot, shown in Fig. 5, equipped with one binocular camera. The binocular camera is a Bumblebee2 made by Point Grey Research. A global camera mounted on the ceiling is used to obtain the image of the whole environment. Global images of 768×576 pixels are sent to the mobile robot through wireless Ethernet. The binocular camera mounted on the robot head sends stereo images of 512×384 pixels to the robot through the IEEE 1394 bus. The binocular camera was calibrated before the experiment.

The complete movement of the service robot can be divided into two parts: coarse localization and precise localization. In the following parts, we explain the experimental results from these two aspects.

    Fig.5 The photo of the restaurant service robot

    2.2 Global vision-based coarse localization

In this part, the global vision localization method is explained through the following example. The mobile robot receives the image of the restaurant from the global camera through wireless Ethernet. To acquire usable values from the image, the raw image must first be rectified. Figure 6 shows the rectified global image obtained from the camera. To localize the robot in the image, we find the ROI and detect the color mark on the robot head using the color segmentation method. The result of the image processing is shown in Fig. 7.

After the location of the robot is detected in the image, its location relative to the pinhole camera coordinate system should be calculated. The size of the picture is known in advance, and the height of the camera relative to the floor and the height of the robot are invariant and also known in advance. Because the position and parameters of the global camera are invariant, the scales of pixels to centimeters in the X-axis and Y-axis directions can be acquired. These parameters, which are used to calculate the location of the robot, are all shown in Table 1.

Table 1 The parameters used in this experiment

In Fig. 7, the origin of the picture coordinate system is at the top left-hand corner of the picture, and the coordinates of the color mark in the picture coordinate system have been calculated. According to Fig. 1, we obtain the transformed coordinates (X′_HC, Y′_HC) in the pinhole coordinate system, calculated as (−68.50, −311.33). After (X′_HC, Y′_HC) is obtained, the robot can be positioned in the image as (−39.14, −177.90).

Next, the real coordinates of the robot in GCS should be reckoned. The origin of GCS is at the top right-hand corner of the carpet array shown in Fig. 7, and the coordinates of that corner are (216, 237) in the pinhole coordinate system. So, the coordinates of the robot in GCS are calculated as

Many experimental results show that the accuracy of global vision positioning is within ±7 cm. In this study, the coordinates of the robot are updated every 100 ms to ensure that the robot can obtain its real-time coordinates.

    2.3 Stereo vision-based precise localization

When the robot moves into the area where the object is placed, under the guidance of the global vision, the stereo vision-based localization method is adopted to provide the high positioning accuracy. The images acquired from the binocular camera are first rectified using an image correction algorithm to prepare usable images for the next process. The rectified image from the stereo vision during the robot movement is shown in Fig. 8.

    Fig.8 The rectified stereo vision image

In this study, the image taken by the right camera is used as the reference image, in which the ROI is extracted to realize object feature matching and obtain the disparity map of the object. The object region is detected and extracted using the color segmentation and shape-based matching methods. The procedure of color segmentation is as follows. Firstly, the HSV color space is obtained through image color space conversion. Then, appropriate thresholds are set on the Hue and Saturation components. Finally, the plate region is extracted from the reference image. The plate region can also be found in the image using the shape-based matching method. The center coordinates of each extracted region of interest are first extracted; then, the expected region is obtained according to Eq. (6). The plate region in Fig. 9 is the expected region obtained by the methods above.

    Fig.9 The expected region of interest

Next, we find the characteristic sign in the expected region and match the images using the NCC algorithm. In the process, a consistency test between the left and right images is also used to ensure accurate matching. The obtained depth image is shown in Fig. 10.

    Fig.10 The obtained depth image using NCC

From Fig. 10, we know that the depth value of the characteristic sign is 40.07. Using the binocular vision system, the coordinates of the characteristic mark are calculated as [1.4 0.2 118.9]^T. In addition, from the encoders we know that α = 45° and θ = 0°. Then, using Eq. (7), the homogeneous transformation matrix T between RCS and the binocular vision coordinate system is denoted as follows:

Finally, according to Eq. (8), the result is reckoned as [x′ y′ z′] = [−4.6 88.5 83.9]. In addition, the angle of the object relative to the robot should be calculated to determine the position of the color mark. In the experiment, the edge lines of the object are first extracted using the Canny operator. Then, the extracted edge lines are filtered with a threshold filter to find the available lines. Finally, the selected lines are filtered with a mean filter to calculate the angle of the object. The lines obtained using this method are shown in Fig. 11, from which we know that the angle of the object is −3.865°.

    Fig.11 The extracted edge lines of the object

After determining the position of the color mark, the robot can adjust its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. During the movement, the robot constantly repeats the positioning process above and adjusts its pose until precise positioning is finally realized.

We performed the whole movement at a velocity of 0.15 m/s sixty times in our laboratory, with the initial position of the robot random and unknown each time. The service robot could always realize self-localization with positioning accuracy within ±2 cm and successfully grab the object for the customers within 3 min. The statistical data are shown in Table 2.

    Table 2 The application and positioning accuracy of the hierarchical localization method

    3 Conclusions and Future Work

In this paper, a mobile robot localization method based on visual feedback is proposed and applied to a restaurant service robot. The whole positioning process is divided into two stages: coarse positioning and precise positioning. A hierarchical positioning method is adopted to provide different positioning accuracies in the two stages. In the coarse positioning stage, the robot periodically obtains its current coordinates using the global vision-based localization method. In the precise positioning stage, the robot obtains higher localization precision using the binocular vision-based localization method. Finally, the robot can successfully grasp the plate on the table. Many experiments verify that the proposed algorithm localizes well and achieves positioning accuracy within ±2 cm.

Currently, research on improving the positioning accuracy and speed of the algorithm is under way. Our future work will also focus on improving the anti-interference ability in more complex environments.

    [1]Siegwart R,Nourbakhsh I.Introduction to Autonomous Mobile Robots[M].Cambridge:The MIT Press,2004:121-156.

[2]Choi B S,Lee J J.Mobile Robot Localization Scheme Based on RFID and Sonar Fusion System[C].IEEE International Symposium on Industrial Electronics,Seoul,Korea,2009:1035-1040.

    [3]Choi B S,Lee J J.Localization of a Mobile Robot Based on an Ultrasonic Sensor Using Dynamic Obstacles[J].Artificial Life and Robotics,2008,12(1/2):280-283.

[4]Bonnifait P,Garcia G.Design and Experimental Validation of an Odometric and Goniometric Localization System for Outdoor Robot Vehicles[J].IEEE Transactions on Robotics and Automation,1998,14(4):541-548.

    [5]Balaguer B,Carpin S,Balakirsky S. Towards Quantitative Comparisons of Robot Algorithms:Experiences with SLAM in Simulation and Real World Systems[C].IEEE/RSJ International Conference on Intelligent Robots and Systems,California,USA,2007:1-7.

    [6]Lin H H,Tsai C C.Laser Pose Estimation and Tracking Using Fuzzy Extended Information Filtering for an Autonomous Mobile Robot[J].Journal of Intelligent and Robotic Systems,2008,53(2):119-143.

[7]Munguia R,Grau A.Monocular SLAM for Visual Odometry[C].IEEE International Symposium on Intelligent Signal Processing,Madrid,Spain,2007:1-6.

[8]Lin H Y,Lin J H,Wang M L.A Visual Positioning System for Vehicle or Mobile Robot Navigation[C].Proceedings of the 8th International IEEE Conference on Intelligent Transportation Systems,Vienna,Austria,2005:73-78.

[9]Blanco J L,Gonzalez J,Fernandez-Madrigal J A.Consistent Observation Grouping for Generating Metric-Topological Maps that Improves Robot Localization[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:818-823.

    [10]Guo Y,Xu X H.Color Landmark Design for Mobile Robot Localization[C].IMACS Multiconference on Computational Engineering in Systems Applications,Beijing,China,2006:1868-1874.

    [11]Kwon T B,Yang J H,Song J B,et al.Efficiency Improvement in Monte Carlo Localization through Topological Information[C].Proceeding of 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems,Beijing,China,2006:424-429.

[12]Tapus A,Siegwart R.A Cognitive Modeling of Space Using Fingerprints of Places for Mobile Robot Navigation[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:1188-1193.

[13]Borenstein J,Everett B,Feng L.Navigating Mobile Robots:Systems and Techniques[M].Wellesley,USA:A.K.Peters Ltd.,1996:103-150.

[14]Miro J V,Zhou W Z,Dissanayake G.Towards Vision Based Navigation in Large Indoor Environments[C].IEEE International Conference on Intelligent Robots and Systems,Beijing,China,2006:2096-2102.

[15]Grzyb B,Chinellato E,Morales A,et al.A 3D Grasping System Based on Multimodal Visual and Tactile Processing[J].Industrial Robot:An International Journal,2009,36(4):365-369.

    [16]Greggio N,Bernardino A,Laschi C,et al.Real-Time 3D Stereo Tracking and Localizing of Spherical Objects with the iCub Robotic Platform [J].Journal of Intelligent and Robotic Systems,2011,63(3/4):417-446.

[17]Chinellato E,Grzyb B J,del Pobil A P.Brain Mechanisms for Robotic Object Pose Estimation[C].International Joint Conference on Neural Networks,Hong Kong,China,2008:3268-3275.

    [18]Lang H X,Wang Y,de Silva C W.Mobile Robot Localization and Object Pose Estimation Using Optical Encoder,Vision and Laser Sensors[C].Proceedings of the IEEE International Conference on Automation and Logistics,Qingdao,China,2008:617-622.

[19]Yu Q X,Yuan C,Fu Z,et al.Research of the Localization of Restaurant Service Robot[J].International Journal of Advanced Robotic Systems,2010,7(3):227-238.

[20]El Munim H A,Farag A A.Shape Representation and Registration Using Vector Distance Functions[C].Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,Minneapolis,USA,2007:1-8.

    [21]Sun Z X,Wu Q.TS201 Based Fast Algorithm of Normalized Cross-Correlation[J].Modern Electronics Technique,2010,33(10):125-127.(in Chinese)
