
    Service Robot Localization Based on Global Vision and Stereo Vision

    2012-02-07 07:46:48

    YU Qing-xiao (于清曉), YAN Wei-xin (閆維新), FU Zhuang (付莊), ZHAO Yan-zheng (趙言正)*

    1 State Key Laboratory of Mechanical System and Vibration, Shanghai Jiaotong University, Shanghai 200240, China

    2 State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001, China

    Introduction

    Mobile robot localization and object pose estimation in a working environment have been central research activities in mobile robotics. Solving mobile robot localization requires addressing two main problems[1]: the robot must have a representation of the environment, and the robot must have a representation of its belief regarding its pose in this environment. Sensors are the basis for addressing both problems. Many of the sensors available to mobile robots are introduced in Ref. [1], along with their basic principles and performance limitations. Among these, ultrasonic sensors[2,3], goniometers[4], laser range finders[5,6], and charge coupled device (CCD) cameras[7,8] are commonly applied in mobile robot localization projects to gather high-precision information for the robot's pose estimation.

    Most mobile robot localization approaches can be broadly classified into three major types: metric, topological, and hybrid. Metric approaches[9,10] are useful when the robot needs to know its position accurately in terms of metric coordinates. The state of the robot can also be represented in a more qualitative manner, by using a topological map[11,12]. But these methods confront many difficulties when the environment changes. A vision-based localization and mapping algorithm using scale-invariant image features (SIFT) was applied for mobile robot localization and map building in Ref. [13]. Miro et al.[14] used a binocular vision system to generate the disparity map by feature matching and realized self-localization and self-navigation. Grzyb et al.[15] adopted stereo images and biologically inspired algorithms to accurately estimate the pose, size, and shape of the target object. Greggio et al.[16] applied stereo cameras and a real-time tracking algorithm to realize pattern recognition and evaluate the 3D position of a generic spatial object. Chinellato et al.[17] proposed a visual analysis, based on a computational model of distance and orientation estimation inspired by human visual mechanisms, to extract the features and pose of the object.

    Considering positioning cost and computational efficiency, a hierarchical vision-based localization method is proposed in this paper to provide the absolute coordinates of the robot. The proposed algorithm estimates real-time coordinates well and achieves positioning accuracy within ±2 cm, ensuring that the robot can successfully grab the object.

    1 Hierarchical Positioning Method

    1.1 Global vision-based coarse localization

    The estimation of the mobile robot's pose is a fundamental problem, which can be roughly divided into two classes[13]: methods for keeping track of the robot's pose and methods for global pose estimation. Most of the present research has concentrated on the first class, which assumes that the initial pose of the robot is known[18]. In this study, both methods are used to position the mobile robot in the initial stage. The global vision-based localization method is used to calculate the initial position of the robot and periodically update its current coordinates, and the dead-reckoning method is also used to localize the robot as a supplementary method. In the following part, the global vision-based localization method is explained in detail.

    One monocular camera mounted on the ceiling is used to acquire and transmit the global image to the robot through the wireless network. The image is first processed to remove image distortion. Then, a color segmentation method is used to locate the robot in the image and distinguish the color mark on its head in the Hue-Saturation-Value (HSV) color space. After distinguishing and extracting the color mark, the coordinates of the color mark can be obtained to calculate the coordinates of the robot, according to the global vision model shown in Fig. 1.
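The segmentation step above can be sketched as a simple HSV threshold followed by a centroid computation. This is a minimal illustration, not the paper's implementation; the hue/saturation/value thresholds and the synthetic test image are assumptions for demonstration.

```python
import numpy as np

def segment_color_mark(hsv_image, h_range, s_min, v_min):
    """Binary mask of pixels whose HSV values fall inside the hue range
    and above the saturation/value floors (illustrative thresholds)."""
    h, s, v = hsv_image[..., 0], hsv_image[..., 1], hsv_image[..., 2]
    return (h >= h_range[0]) & (h <= h_range[1]) & (s >= s_min) & (v >= v_min)

def mark_center(mask):
    """Centroid of the segmented region, in pixel coordinates (x, y)."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None          # mark not visible in this frame
    return xs.mean(), ys.mean()

# Tiny synthetic example: a 5x5 HSV image with a red-ish 2x2 patch.
img = np.zeros((5, 5, 3))
img[1:3, 2:4] = [0.02, 0.9, 0.9]          # hue near 0 (red), high S and V
m = segment_color_mark(img, (0.0, 0.05), 0.5, 0.5)
print(mark_center(m))                      # centroid of the 2x2 patch
```

The centroid of the mark is what feeds the geometric model of Fig. 1 as (X_HC, Y_HC).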

    In Fig. 1, (0,0) denotes the origin of the camera coordinate system; H_C denotes the height of the camera relative to the floor; H_R denotes the height of the robot; (X_RC, Y_RC) is the center coordinate of the robot; (X_HC, Y_HC) is the center coordinate of the color mark.

    According to the geometric relationship,the following equation can be obtained:

    Then,the following results can be calculated as:
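Equations (1) and (2) did not survive the text extraction. From the similar-triangle geometry of Fig. 1 (camera at height H_C, color mark on the robot head at height H_R), and consistent with the numerical example in Section 2.2, a plausible reconstruction is:

```latex
% Similar triangles between the camera ray to the head mark
% and its intersection with the floor plane:
\frac{X_{HC}}{X_{RC}} = \frac{Y_{HC}}{Y_{RC}} = \frac{H_C}{H_C - H_R} \qquad (1)
% Solving for the robot center:
X_{RC} = \frac{H_C - H_R}{H_C}\, X_{HC}, \qquad
Y_{RC} = \frac{H_C - H_R}{H_C}\, Y_{HC} \qquad (2)
```

This form agrees with the observation below that (X_RC, Y_RC) depends only on (X_HC, Y_HC) once H_C and H_R are fixed.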

    Fig.1 The global vision camera model

    From Eq. (2), we know that (X_RC, Y_RC) depends on the values of H_C, H_R, and (X_HC, Y_HC). But H_C and H_R are invariant in this study, so (X_RC, Y_RC) depends only on (X_HC, Y_HC).

    Four coordinate systems are under consideration: the global coordinate system (GCS), the robot coordinate system (RCS), the sensor coordinate system (SCS), and the virtual global coordinate system (VGCS)[19]. So, the appropriate transformation between VGCS and SCS should be acquired to calculate the absolute coordinates of the robot. According to the geometric relationship in Fig. 1, the homogeneous transformation matrix T between OXYZ and O′X′Y′Z′ can be obtained as:

    Then,the coordinates of the robot in VGCS can be calculated as:

    In this study, VGCS coincides with GCS except for a difference in scale. So, the absolute coordinates of the robot can be obtained as:

    where K_height and K_width are the relative scales to the VGCS; m and n are the shift distances of VGCS relative to SCS.
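The coarse-localization chain can be sketched end to end: project the head mark down to the robot center (Eq. (2)), then scale and shift into GCS. Since Eqs. (4)-(5) are not reproduced in the text, the additive form of the shift (m, n) is an assumption for illustration.

```python
def robot_absolute_coords(mark_px, H_C, H_R, K_width, K_height, m, n):
    """Coarse localization sketch: color-mark coordinates in the camera
    (pinhole) frame -> absolute robot coordinates in GCS.

    mark_px  : (X_HC, Y_HC), mark center in the pinhole frame
    H_C, H_R : camera height and robot height (invariant here)
    K_width, K_height : pixel-to-centimetre scales
    m, n     : shifts of VGCS relative to SCS (additive form assumed)
    """
    X_HC, Y_HC = mark_px
    # Eq.(2): project the head mark down to the robot center on the floor.
    k = (H_C - H_R) / H_C
    X_RC, Y_RC = k * X_HC, k * Y_HC
    # Assumed form of Eq.(5): scale to world units and shift into GCS.
    return K_width * (X_RC + m), K_height * (Y_RC + n)
```

For example, with H_C = 2, H_R = 1 (so the projection factor is 0.5), unit scales, and zero shifts, a mark at (10, 20) maps to a robot center at (5, 10).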

    In this way, the robot knows its initial position wherever it is placed and periodically updates its current coordinates. Furthermore, real-time positioning of the robot can be realized in an indoor environment with the help of the dead-reckoning method.

    1.2 Stereo vision-based precise localization

    When the service robot, under the guidance of the global vision, moves into the area where the object is placed, the binocular vision-based localization method is adopted to provide the high positioning accuracy needed to successfully grab the object for the customer. In this study, color segmentation and shape-based matching[20] are separately applied to detect the object, and the extracted results are fused to obtain the accurate object region. The color segmentation method has been discussed above. Below, we describe the shape-based matching method.

    The advantage of shape-based matching is its great robustness and flexibility. Instead of using the gray values, features along contours are extracted and used for model generation and matching. The method is invariant to changes in illumination and variations of object gray values. The method also allows the object to be rotated and scaled. The process of shape-based matching is divided into two distinct phases: the training phase and the recognition phase. In the first phase, the template model is created and trained. In the second phase, the model is used to find and localize the object in the image. The basic concept of shape-based image matching is shown in Fig. 2.

    Fig.2 The basic concept of shape-based image matching

    Using the methods above, the regions of interest (ROI) are separately obtained. Then, the following information fusion method is adopted to find the expected region of the object. Suppose that image regions A are obtained using the color segmentation method and image regions B are obtained using the shape-based matching method. The center points of image regions A and B are all extracted and compared one by one. If the difference between the center coordinates of an A region and those of a B region is less than the default constant ε, the image area belongs to the expected regions of interest. Otherwise, the image area is an erroneously detected region. Then, the expected regions of interest can be obtained as:

    where (x_i, y_i) is the center coordinate of the i-th sub-region of image regions A; (x_j, y_j) is the center coordinate of the j-th sub-region of image regions B.
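The fusion rule described above can be sketched directly. A minimal version, assuming a per-axis comparison against ε and taking the midpoint of the two matched centers as the fused center (Eq. (6) itself is not reproduced in the text, so both choices are assumptions):

```python
def fuse_regions(centers_a, centers_b, eps):
    """Keep an A region (from color segmentation) only if some B region
    (from shape-based matching) has a center within eps on both axes;
    the fused center is taken as the midpoint of the matched pair."""
    fused = []
    for (xi, yi) in centers_a:
        for (xj, yj) in centers_b:
            if abs(xi - xj) < eps and abs(yi - yj) < eps:
                fused.append(((xi + xj) / 2, (yi + yj) / 2))
                break  # this A region is confirmed; move to the next one
    return fused

# One confirmed region, one false detection (no nearby B center):
print(fuse_regions([(10, 10), (50, 50)], [(11, 9)], eps=3))
```

Regions detected by only one of the two methods are discarded as error-detected regions, which is exactly the robustness the fusion step is meant to provide.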

    According to Eq. (6), the expected regions can be obtained to find and match the object. After obtaining the plate region, the normalized cross-correlation (NCC) matching method[21] is adopted to obtain the depth image. Then, the 3D coordinates of the color mark on the object can be calculated using the binocular vision system. To localize the mobile robot, the homogeneous transformation matrix between the binocular coordinate system and RCS should first be obtained. Figure 3 shows the relationship between the binocular coordinate system and RCS.

    Fig.3 The diagram of binocular coordinate system and RCS

    In Fig. 3, α and θ are the rotation angles of the head; O′X′Y′Z′ denotes RCS, and O″X″Y″Z″ denotes the binocular coordinate system. When α and θ are equal to zero, O″X″Y″Z″ is shifted by −6 cm in the X-axis direction relative to O′X′Y′Z′. According to this information, the homogeneous transformation matrix T between O′X′Y′Z′ and O″X″Y″Z″ can be obtained as:

    Then,the coordinates of the color mark in RCS can be calculated as :

    where [x′ y′ z′]^T is the location of the color mark in RCS; [x″ y″ z″]^T is its location in the binocular coordinate system.
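The transform can be sketched as a standard homogeneous matrix built from the two head angles and the fixed −6 cm offset. Since Eq. (7) is not reproduced in the text, the axis conventions here are assumptions: α is taken as a tilt about the X-axis, θ as a pan about the Z-axis.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def mark_in_rcs(p_binocular, alpha, theta):
    """Transform the color-mark position from the binocular frame into RCS.
    Axis assignments for alpha/theta are assumed; the -6 cm X offset is the
    zero-angle geometry stated for Fig. 3."""
    R = rot_z(theta) @ rot_x(alpha)
    t = np.array([-6.0, 0.0, 0.0])                       # cm
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    p = np.append(np.asarray(p_binocular, float), 1.0)   # homogeneous point
    return (T @ p)[:3]
```

As a sanity check, with α = θ = 0 a mark at [1.4 0.2 118.9]^T in the binocular frame maps to x′ = 1.4 − 6 = −4.6, matching the X-component of the worked example in Section 2.3.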

    In addition to obtaining the 3D coordinates of the color mark, the angle of the object relative to the robot should also be calculated, by extracting and filtering the edge lines of the object. The detailed process is as follows. First, the region of the object is processed using the dilation algorithm and the Canny operator. Then, the extracted edge lines are filtered using a threshold filter to obtain the available edge lines which can represent the angle of the object. Next, the selected edge lines are filtered using a mean filter, which can be used to calculate the angle of the object. Finally, the angle of the object relative to the robot, φ_object, can be reckoned as follows.

    where φ denotes the detected angle of the object in the image and θ is the rotation angle of the neck joint.
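The last step can be sketched as follows: average the orientations of the surviving edge lines to get φ, then combine with the neck pan angle θ. Since the paper's formula for φ_object is not reproduced in the text, the simple sum φ_object = φ + θ is an assumed reading.

```python
import numpy as np

def line_angle_deg(p1, p2):
    """Orientation of an extracted edge line, in degrees, from its endpoints."""
    return np.degrees(np.arctan2(p2[1] - p1[1], p2[0] - p1[0]))

def object_angle_deg(edge_lines, theta_deg):
    """Mean-filter the edge-line angles (phi), then add the neck pan angle.
    phi_object = phi + theta is an assumption, not the paper's exact Eq."""
    phi = float(np.mean([line_angle_deg(p1, p2) for p1, p2 in edge_lines]))
    return phi + theta_deg

# Two nearly level edge lines of a plate, neck pan at zero:
lines = [((0, 0), (10, -1)), ((0, 5), (10, 4))]
print(object_angle_deg(lines, theta_deg=0.0))   # a small negative angle
```

Averaging over several filtered lines is what makes the estimate robust to single spurious edges, which is the point of the threshold and mean filters above.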

    After determining the position of the color mark,the robot can adjust the velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system.Periodically localize the position of the color mark using the method above and adjust the pose of the robot until the robot successfully realizes its precise positioning relative to the color mark.The location-based visual servo control system is shown in Fig.4.
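One cycle of such a location-based servo loop can be sketched as a simple proportional controller. The gains, the stopping radius (set to the ±2 cm target of this paper), and the controller structure are illustrative assumptions; the paper does not specify its controller.

```python
import numpy as np

def servo_step(robot_pose, mark_xy, v_max=0.15, k_w=1.0, stop_dist=0.02):
    """One cycle of a location-based visual servo loop (minimal sketch).

    robot_pose: (x, y, heading) in metres/radians; mark_xy: target (x, y).
    Returns (v, omega): forward-velocity and heading-rate commands.
    """
    x, y, th = robot_pose
    dx, dy = mark_xy[0] - x, mark_xy[1] - y
    dist = np.hypot(dx, dy)
    if dist < stop_dist:                 # inside the +/-2 cm goal region
        return 0.0, 0.0
    heading_err = np.arctan2(dy, dx) - th
    # Wrap the error into (-pi, pi] so the robot turns the short way.
    heading_err = np.arctan2(np.sin(heading_err), np.cos(heading_err))
    v = min(v_max, dist)                 # slow down as the target nears
    return v, k_w * heading_err
```

Calling this every localization cycle (100 ms in the experiments below) drives the robot toward the mark and stops it once the precise-positioning tolerance is reached.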

    Fig.4 The chart of the location-based visual servo control system

    2 Experiments and Result Analysis

    2.1 Experiment setup

    The experimental platform is a restaurant robot, shown in Fig. 5, which is equipped with one binocular camera. The binocular camera is a Bumblebee2, made by Point Grey Research. A global camera mounted on the ceiling is used to obtain the image of the whole environment. The global images, with 768×576 pixels, are sent to the mobile robot through wireless Ethernet. The binocular camera mounted on the robot head sends stereo images with 512×384 pixels to the robot through an IEEE 1394 bus. The binocular camera was calibrated before the experiment.

    The complete movement of the service robot can be divided into two parts: coarse localization and precise localization. In the following parts, we explain the experimental results of the complete movement from these two aspects.

    Fig.5 The photo of the restaurant service robot

    2.2 Global vision-based coarse localization

    In this part, the global vision localization method is explained through the following example. The mobile robot receives the image of the restaurant from the global camera through wireless Ethernet. To acquire usable values from the image, we first rectify the raw image. Figure 6 shows the rectified global image obtained from the camera. To localize the robot in the image, we find the ROI and detect the color mark on the robot head using the color segmentation method. The result of the image processing is shown in Fig. 7.

    After the location of the robot is detected in the image, its location relative to the pinhole camera coordinate system should be calculated. The size of the picture is known in advance, and the height of the camera relative to the floor and the height of the robot are invariant and also known in advance. Because the position and parameters of the global camera are invariant, the scales of a pixel relative to one centimeter in the X-axis and Y-axis directions can be acquired. These parameters, which are used to calculate the location of the robot, are all shown in Table 1.

    Table 1 Parameters used in this experiment

    In Fig. 7, the origin of the picture coordinate system is in the top left-hand corner of the picture, and the coordinates of the color mark in the picture coordinate system have been calculated. According to Fig. 1, we obtain the transformed coordinates, (X′_HC, Y′_HC), in the pinhole coordinate system; the result is (−68.50, −311.33). After (X′_HC, Y′_HC) is obtained, the robot can be positioned in the image at (−39.14, −177.90).
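These numbers let us check the geometric model of Eq. (2) directly: both coordinates of the robot center should be the mark coordinates shrunk by the single factor (H_C − H_R)/H_C.

```python
# Consistency check of the worked example against Eq.(2): the robot center
# should be the mark position scaled by one factor, (H_C - H_R)/H_C.
mark = (-68.50, -311.33)      # (X'_HC, Y'_HC) in the pinhole frame
robot = (-39.14, -177.90)     # reported robot position in the image

kx = robot[0] / mark[0]
ky = robot[1] / mark[1]
print(kx, ky)                 # the two ratios should agree
```

Both ratios come out to about 0.571, confirming that the reported values are consistent with a single height-ratio projection factor.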

    Next, the real coordinates of the robot in GCS should be reckoned. The origin of GCS is at the top right-hand corner of the carpet array shown in Fig. 7, whose coordinates are (216, 237) in the pinhole coordinate system. So, the coordinates of the robot in GCS are calculated as

    Many experimental results show that the accuracy of global vision positioning is within ±7 cm. In this study, the coordinates of the robot are updated every 100 ms to make sure that the robot can obtain its real-time coordinates.

    2.3 Stereo vision-based precise localization

    When the robot, under the guidance of the global vision, moves into the area where the object is placed, the stereo vision-based localization method is adopted to provide the high positioning accuracy. The images acquired from the binocular camera are first rectified using an image correction algorithm to prepare them for the next process. The rectified image from the stereo vision during the robot's movement is shown in Fig. 8.

    Fig.8 The rectified stereo vision image

    In this study, the image taken by the right camera is used as the reference image, from which the ROI is extracted to realize object feature matching and obtain the disparity map of the object. The object region is detected and extracted using the color segmentation and shape-based matching methods. The procedure of color segmentation is as follows. First, the HSV color space is obtained through image color space conversion. Then, appropriate thresholds are set on the Hue and Saturation components. Finally, the plate region is extracted from the reference image. The plate region can also be found in the image using the shape-based matching method. The center coordinates of each extracted region of interest are first extracted; then, the expected region is obtained according to Eq. (6). The plate region in Fig. 9 is the expected region obtained using the methods above.

    Fig.9 The expected region of interest

    Next, we find the characteristic sign in the expected region and match the images using the NCC algorithm. In the process, a consistency test between the left and right images is also used to ensure accurate matching. The obtained depth image is shown in Fig. 10.

    Fig.10 The obtained depth image using NCC

    From Fig. 10, we can see that the depth value of the characteristic sign is 40.07. According to the binocular vision system, the coordinates of the characteristic mark can be calculated as [1.4 0.2 118.9]^T. In addition, using the encoders, we know that α = 45° and θ = 0°. Then, using Eq. (7), the homogeneous transformation matrix T between RCS and the binocular vision coordinate system is as follows:

    Finally, according to Eq. (8), the result can be reckoned as [x′ y′ z′] = [−4.6 88.5 83.9]. In addition, the angle of the object relative to the robot should be calculated to determine the position of the color mark. In the experiment, the edge lines of the object are first extracted using the Canny operator. Then, the extracted edge lines are filtered using the threshold filter to find the available lines. Finally, the selected lines are filtered using the mean filter to calculate the angle of the object. The lines obtained using the method above are shown in Fig. 11. From Fig. 11, we know that the angle of the object is −3.865°.

    Fig.11 The extracted edge lines of the object

    After determining the position of the color mark, the robot can adjust its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. During the movement, the robot constantly repeats the positioning process above and adjusts its pose until precise positioning is finally realized.

    We performed the whole movement at a velocity of 0.15 m/s sixty times in our laboratory, and the initial position of the robot was random and unknown each time. The service robot could always realize self-localization with a positioning accuracy within ±2 cm and successfully grab the object for the customers within 3 min. The statistical data are shown in Table 2.

    Table 2 The application and positioning accuracy of the hierarchical localization method

    3 Conclusions and Future Work

    In this paper, a method of mobile robot localization based on visual feedback is proposed and applied to a restaurant service robot. The whole positioning process is divided into two stages: coarse positioning and precise positioning. A hierarchical positioning method is adopted to provide different positioning accuracies in the two stages. In the coarse positioning stage, the robot periodically obtains its current coordinates using the global vision-based localization method. In the precise positioning stage, the robot obtains higher localization precision using the binocular vision-based localization method. Finally, the robot can successfully grasp the plate on the table. Many experiments verify that the proposed algorithm localizes well and achieves positioning accuracy within ±2 cm.

    Currently, research on improving the positioning accuracy and speed of the algorithm is under way. Our future work will also focus on improving the anti-interference ability in more complex environments.

    [1]Siegwart R,Nourbakhsh I.Introduction to Autonomous Mobile Robots[M].Cambridge:The MIT Press,2004:121-156.

    [2]Choi B S,Lee J J.Mobile Robot Localization Scheme Based on RFID and Sonar Fusion System[C].IEEE International Symposium on Industrial Electronics,Seoul,Korea,2009:1035-1040.

    [3]Choi B S,Lee J J.Localization of a Mobile Robot Based on an Ultrasonic Sensor Using Dynamic Obstacles[J].Artificial Life and Robotics,2008,12(1/2):280-283.

    [4]Bonnifait P,Garcia G.Design and Experimental Validation of an Odometric and Goniometric Localization System for Outdoor Robot Vehicles[J].IEEE Transactions on Robotics and Automation,1998,14(4):541-548.

    [5]Balaguer B,Carpin S,Balakirsky S. Towards Quantitative Comparisons of Robot Algorithms:Experiences with SLAM in Simulation and Real World Systems[C].IEEE/RSJ International Conference on Intelligent Robots and Systems,California,USA,2007:1-7.

    [6]Lin H H,Tsai C C.Laser Pose Estimation and Tracking Using Fuzzy Extended Information Filtering for an Autonomous Mobile Robot[J].Journal of Intelligent and Robotic Systems,2008,53(2):119-143.

    [7]Munguia R,Grau A.Monocular SLAM for Visual Odometry[C].IEEE International Symposium on Intelligent Signal Processing,Madrid,Spain,2007:1-6.

    [8]Lin H Y,Lin J H,Wang M L.A Visual Positioning System for Vehicle or Mobile Robot Navigation[C].Proceedings of the 8th International IEEE Conference on Intelligent Transportation Systems,Vienna,Austria,2005:73-78.

    [9]Blanco J L,Gonzalez J,Fernandez-Madrigal J A.Consistent Observation Grouping for Generating Metric-Topological Maps that Improves Robot Localization[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:818-823.

    [10]Guo Y,Xu X H.Color Landmark Design for Mobile Robot Localization[C].IMACS Multiconference on Computational Engineering in Systems Applications,Beijing,China,2006:1868-1874.

    [11]Kwon T B,Yang J H,Song J B,et al.Efficiency Improvement in Monte Carlo Localization through Topological Information[C].Proceeding of 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems,Beijing,China,2006:424-429.

    [12]Tapus A,Siegwart R.A Cognitive Modeling of Space Using Fingerprints of Places for Mobile Robot Navigation[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:1188-1193.

    [13]Borenstein J,Everett B,Feng L.Navigating Mobile Robots:Systems and Techniques[M].Wellesley,USA:A.K.Peters Ltd.,1996:103-150.

    [14]Miro J V,Zhou W Z,Dissanayake G.Towards Vision Based Navigation in Large Indoor Environments[C]. IEEE International Conference on Intelligent Robots and Systems,Beijing,China,2006:2096-2102.

    [15]Grzyb B,Chinellato E,Morales A,et al.A 3D Grasping System Based on Multimodal Visual and Tactile Processing[J].Industrial Robot:An International Journal,2009,36(4):365-369.

    [16]Greggio N,Bernardino A,Laschi C,et al.Real-Time 3D Stereo Tracking and Localizing of Spherical Objects with the iCub Robotic Platform [J].Journal of Intelligent and Robotic Systems,2011,63(3/4):417-446.

    [17]Chinellato E,Grzyb B J,del Pobil A P.Brain Mechanisms for Robotic Object Pose Estimation[C].International Joint Conference on Neural Networks,Hong Kong,China,2008:3268-3275.

    [18]Lang H X,Wang Y,de Silva C W.Mobile Robot Localization and Object Pose Estimation Using Optical Encoder,Vision and Laser Sensors[C].Proceedings of the IEEE International Conference on Automation and Logistics,Qingdao,China,2008:617-622.

    [19]Yu Q X,Yuan C,Fu Z,et al.Research of the Localization of Restaurant Service Robot[J].International Journal of Advanced Robotic Systems,2010,7(3):227-238.

    [20]El Munim H A,Farag A A.Shape Representation and Registration Using Vector Distance Functions[C].Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,Minneapolis,USA,2007:1-8.

    [21]Sun Z X,Wu Q.TS201 Based Fast Algorithm of Normalized Cross-Correlation[J].Modern Electronics Technique,2010,33(10):125-127.(in Chinese)
