
    Design of Authoring Tool for Static and Dynamic Projection Mapping

Computers, Materials & Continua, 2021, No. 1

Sang-Joon Kim, Nammee Moon, Min Hong, Gooman Park and Yoo-Joo Choi

1 Seoul National University of Science and Technology, 172 Gongneung-dong, Nowon-gu, Seoul, Korea

2 Hoseo University, 165 Sechul-ri, Baebang-eup, Asan-si, Chungcheongnam-do, Korea

3 Soonchunhyang University, 336-745, Asan-si, Chungcheongnam-do, Korea

4 Seoul Media Institute Technology, 661 Deungchon-dong, Gangseo-gu, Seoul, Korea

Abstract: This study introduces the design details of a tool to create interactive projection-mapping content in a convenient manner. For the proposed tool design, a homography-based camera–projector calibration method was applied with the use of red–green–blue–depth images from a Kinect V2 sensor, which did not require accurate camera calibration prerequisites. In addition, the proposed tool simultaneously achieved static projection mapping, which projected the image content onto a fixed object, and dynamic projection mapping, which projected the image content onto a user's body by tracing the moving user. To verify the effectiveness of the proposed content-creation tool, users with no programming capabilities were employed to create content that was projected onto various objects in fixed positions and onto a user's body in various poses, thereby analyzing the tool's completeness. Moreover, the projection accuracy was analyzed at different depth positions, and the projection-mapping accuracy was verified with the use of the proposed method.

Keywords: Dynamic projection mapping; camera–projector calibration; Kinect projection mapping; content authoring tool

    1 Introduction

The evolution of augmented reality (AR) technology has led to its widespread use in various fields, e.g., medicine, industry, education, mobile games, and entertainment [1–3]. AR employs a computer graphic technique that uses an integrated display to match an object in the real world to a different object in a virtual world [4]. There are multiple AR variants. For example, one variant may show the augmented information on a monitor or a smartphone. Another variant may display the augmented information via a head-mounted display (HMD) or glasses [5,6]. These AR variants have disadvantages. First, they only allow a single user to experience the virtual world. Second, they restrict the space within which the user can move, and the user's motions, owing to the device. Third, they easily cause diminished concentration, nausea, or dizziness.

Therefore, interest in spatial augmented reality (SAR) has been increasing. SAR is a projection-mapping technique that enables multiple users to experience virtual content simultaneously, without the need to wear specific gear. SAR enhances visual information by projecting images onto the surfaces of real three-dimensional (3D) objects and spaces [7]. The projection-mapping technique is extensively applied in the field of media arts to show colorful displays, e.g., advertisements, exhibitions, and performances. Moreover, dynamic projection mapping, which traces and projects onto a moving object, has been highlighted extensively compared with the static version, which projects onto a still object [8–11].

However, creating dynamic projection-mapping content can be challenging, as explained below. First, a time-consuming calibration is required between the projector and the sensor (e.g., a camera) used for tracking moving objects. In addition, an accurate camera calibration must be conducted before the camera–projector calibration. Second, the calibration error may increase as the distance between the projector and the camera used for tracking increases. Third, unlike static projection-mapping content, which can be easily created by a user, dynamic projection-mapping content requires a complicated programming process because there are no commercialized content-creation tools. Owing to these problems, it is difficult for media producers and artists who have little knowledge of programming to create dynamic projection-mapping content without assistance.

As such, this study proposes an authoring tool that can dynamically project the desired image contents onto a body by tracking the user's skeleton information without the use of any complicated programming process. The proposed tool employs a homography-based camera–projector calibration method that uses depth information but does not require accurate camera calibration. In addition, it uses text-based configuration files to allow nontechnical users to create projection-mapping content without any other assistance. The tool executes static and dynamic projection mappings.

To verify the effectiveness of the proposed content-creation tool, users with no programming experience were employed. These users were asked to create static projection-mapping content for mapping onto a fixed object, and dynamic projection-mapping content for mapping onto the body of a moving user, within a specified amount of time. In addition, the projection-mapping accuracy was analyzed comparatively based on users in motion at different depth levels. The proposed tool is expected to be extensively applied in stage art fields that require various stage effects, and in public relations.

The rest of this study is organized as follows. Section 2 analyzes the projection-mapping process, the camera–projector calibration, and the current related methods. Further, it investigates the characteristics of commercial projection-mapping software. Section 3 describes the design details of the proposed tool. Section 4 provides the details of the experimental implementation and presents the experimental results. Section 5 discusses the limitations and future work of the proposed tool, and outlines the conclusions.

    2 Related Work

    2.1 Classification of Projection-Mapping Techniques

Projection-mapping methods are classified into static and dynamic. Static projection mapping implies that the shape or position of the target object is fixed. In static projection mapping, if the target object moves, the projection direction must be manually adjusted. This mapping is applied in façade work [12], which performs projection mapping onto large buildings, as well as in merchandise advertisements [13] and media arts [14].

Dynamic projection mapping is used when the position or shape of the projection target object varies. The method traces the position or shape of the target object with various sensors and maps the virtual content to the real content according to the traced information. The target object may be a dynamically moving cloth, a human body, a face, or a general object. It is more difficult to implement dynamic than static projection mapping. Additionally, a complicated calibration process is required in advance. However, it can increase the audience's interest and immersion.

Siegl et al. [15] presented a method that traces a white-colored object (the statue of Agrippa) with an RGB-D camera and generates projection maps onto the object from different angles. Zhou et al. [16] presented a method that traced and mapped a real object with projection mapping on a movable object system (Pmomo), which was based on the Kinect V2 RGB-D camera. Narita et al. proposed a technique that traced the motions of a deformable invisible marker, which was printed with infrared (IR) ink, with the use of a high-speed projector and camera. The same authors also analyzed a T-shirt's motion with the use of an invisible marker attached to it. They projected appropriately deformed image content onto the image of the T-shirt according to the analyzed information [8].

Lee et al. [9] presented a mapping method that projected maps onto the costume of an actor who moved in real time. This method constructed a mask based on the actor's regional information to select a two-dimensional (2D) video, and applied this to the projection. Existing projection-mapping techniques use high-priced equipment or apply simple 2D masks, which are constructed by extracting the area occupied by the user from a camera image. Three-dimensional (3D) spatial information is not utilized in this case.

    2.2 Camera–Projector Calibration

Acquiring a picture of an actual 3D space with a camera produces a 2D image. This 2D image is determined by the camera's own characteristics and by the camera's position and direction. Herein, the intrinsic camera parameters refer to the parameters that define the characteristics of the camera itself. The extrinsic camera parameters refer to the parameters that correspond to the camera's position and direction. To predict how points in 3D space are projected onto 2D camera images, and to inversely restore the 3D coordinates of an observed point from the 2D camera images, the conversion relationship, composed of the intrinsic and extrinsic camera parameters, must be determined. The process used to identify the intrinsic and extrinsic camera parameters is referred to as camera calibration.
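
For reference, this conversion relationship is commonly written as the standard pinhole projection model; the formulation below is the generic one and is not reproduced from a specific equation in this paper:

s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\,[\,R \mid t\,] \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, \qquad K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}

where (X, Y, Z) is a point in the 3D world, (u, v) is its projection in the 2D camera image, s is a scale factor, K contains the intrinsic parameters (focal lengths f_x, f_y and image center c_x, c_y), and [R | t] contains the extrinsic parameters (rotation and translation). Camera calibration estimates K and [R | t] from observed correspondences.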

Various methods have been presented to calculate a camera's intrinsic and extrinsic parameters. Abdel-Aziz et al. [17] proposed the direct linear transformation (DLT), which defines the conversion between the 3D world and 2D camera coordinates. Zhang [18] presented a method that extracts the extrinsic and intrinsic calibration parameters from n poses of a planar pattern of feature points, where n is three or more.

A projector is needed to display AR in a large-scale facility. However, a balance between practicality and accuracy is necessary to precisely adjust a projector with a long focal distance. Projector calibration determines the projector's intrinsic and extrinsic parameters in a manner similar to the camera calibration process. Similar to the case of a camera, the projector's intrinsic parameters indicate its focal length, image center, and lens deformation status, and the extrinsic parameters indicate the projector's position and direction.

Currently, there are two projector-calibration methods, namely, pin-hole model-based calibration [19,20] and homography-based camera–projector calibration [21,22]. The pin-hole model-based calibration method considers a projector as an inverse camera and performs projector calibration by applying Zhang's camera calibration method [18]. As shown in Fig. 1, the calibration of the camera is performed using a printed calibration board; in other words, a conversion matrix (H in Fig. 1) is obtained. Another calibration board is projected by a projector, and the projected calibration board is photographed with a camera. Subsequently, the 3D spatial coordinates of the projected calibration board are estimated with the conversion matrix. The projector's intrinsic and extrinsic parameters are determined based on the estimated 3D spatial coordinates and the projection-screen coordinates.

Figure 1: Pin-hole model-based camera–projector calibration model

The homography calibration method projects a calibration board onto a screen, a wall surface, or a white board with a projector. The projected image is captured again with a camera and saved as a 2D image. The calibration is performed after the determination of the homography relationship between the saved camera image and the projected image.

The camera–projector calibration method based on the pin-hole model requires an accurate camera calibration in advance. Conversely, homography calibration encounters difficulties in achieving accurate projections onto objects that are positioned at various depths. Therefore, this study applied the homography calibration approach, which does not require accurate camera calibration, and devised a method that can a) perform appropriate projection mapping by extracting homographies at various depth values and b) selectively apply a homography according to the depth value of the target object.

    3 Proposed Authoring Tool

    3.1 Overview of the Proposed Tool

The projection-mapping content authoring tool proposed in this study enables both static projection mapping onto a fixed object and dynamic projection mapping onto a moving person based on tracking, without the need for any programming processes. It builds on the simple video mapping tool (SVMT) [23], which supports only static projection mapping, and implements human-body-tracking-based dynamic projection mapping. Fig. 2 shows the four modules of the proposed tool. The first module, the content configuration reading module, reads a content configuration file saved in text form to load the video clips or image files required for projection mapping, and generates polygonal mesh frames according to the frame properties defined in the file. The texture property of the polygonal mesh frames defines the video clips and image files used for projection mapping; these are rendered and mapped onto the initial positions of the polygonal mesh frames. The module also lets the user manually assign the positions and detailed shapes of the polygonal mesh frames. The second module, the camera–projector calibration module, computes and saves the camera–projector homographies at different camera depth locations. The third module performs static projection mapping. It renders polygonal mesh frames at specific positions according to the properties defined in the content configuration file and the manual adjustments made by the user. The fourth module tracks a moving user and projects the desired media contents onto the human body. It tracks user movements with the Kinect V2 system, which traces 25 human body joints, and computes and renders the positions of the polygonal mesh frames that match the user's body with the homography computed in the camera–projector calibration module. The next section provides details on each module.

Figure 2: Schematic overview of the proposed tool

    3.2 Content Initialization Based on Content Configuration File

The properties of the polygonal mesh frames used for projection mapping and the media files (video clips and image files) used for the mesh frame textures are defined in a content configuration file. The file has the same configuration as that of SVMT [23], with two differences: 1) the six frames (Frame 1 to Frame 6) are designated exclusively for dynamic projection mapping, and 2) the frames that match fixed objects start from Frame 7. As a result, the tool can simultaneously track six human bodies and perform dynamic projection mapping. The initial position of each frame is set to predefined values. A user defines the video clips and image files for the desired projection-mapping content and selects the number of frames. For each frame, video and image numbers are assigned for mapping. These configurations are saved in a content configuration file, according to which the proposed tool loads the video and image files and generates polygonal mesh frames that are rendered onto the monitor. Using the mouse, each frame is moved to a desired position, and the shapes of the polygonal mesh frames are manually adjusted to match the shape of the target object for projection.
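
The exact layout of the SVMT configuration file is not given here, so the following Python sketch only illustrates the idea with a hypothetical text format: each frame entry names the media file mapped onto it and an initial rectangle, and the reader builds polygonal mesh frame records from it. The format and field names are assumptions, not SVMT's actual syntax.

# Minimal sketch of a content-configuration reader.
# The file format below is hypothetical; SVMT's real format may differ.

SAMPLE_CONFIG = """
# frame_id  media_file      x    y    width  height
1           wings.mp4       100  100  320    240
7           facade_art.png  640  200  512    384
"""

def load_content_configuration(text):
    frames = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        frame_id, media, x, y, w, h = line.split()
        x, y, w, h = int(x), int(y), int(w), int(h)
        frames.append({
            "id": int(frame_id),
            "media": media,                    # video clip or image used as texture
            "dynamic": int(frame_id) <= 6,     # Frames 1-6 are reserved for body tracking
            "quad": [(x, y), (x + w, y), (x + w, y + h), (x, y + h)],
        })
    return frames

if __name__ == "__main__":
    for frame in load_content_configuration(SAMPLE_CONFIG):
        print(frame)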

    3.3 Homography-Based Camera–Projector Calibration Module

The calibration module projects the chessboard image onto a board arranged at regular intervals to calibrate the camera and projector. The camera is then used to capture the projected image and identify the match between the captured image and the display screen image, as shown in Fig. 3. The module stores the camera image coordinates, depth coordinates, and screen image coordinates for each point on the chessboard.

Figure 3: Homography between the captured camera image and the display image

Eq. (1) represents the homography between the captured camera image and the projected screen image:

s \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = H \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \quad (1)

where [x, y] and [x', y'] denote the pixel coordinates of the captured camera image and the matched screen image, respectively, H is the 3×3 homography matrix, and s is a scale factor. Fig. 3 shows the display screen image and the chessboard projected onto the moving whiteboard. We manually adjusted the four vertices of the chessboard on the screen image and projected the deformed chessboard onto the moving whiteboard. After the whiteboard was moved, we manually readjusted the vertices of the chessboard on the screen image and projected it again. The projected chessboard was captured with the camera. Fig. 4 shows the whiteboard at the same orthogonal distance from the camera: Fig. 4a shows the chessboard projected with the whiteboard placed on the right, and Fig. 4b shows it projected with the whiteboard placed on the left.

Figure 4: Projection of a chessboard on a moving board

Figure 5: Screen coordinates of the 54 corner points (red) and the four vertices (blue) of a 9×6 chessboard

Figure 6: Fifty-four corner points of the chessboard automatically extracted from the image using the OpenCV library
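
A minimal sketch of the corner extraction and homography estimation step described in this subsection is shown below, assuming OpenCV and a 9×6 chessboard. The file name, the way screen-space corner coordinates are obtained, and the ordering convention are assumptions, not the tool's actual interfaces.

# Sketch of homography-based camera-projector calibration for one depth level.
# Assumes the projected chessboard was captured to an image file and that the
# screen-space corner coordinates of the projected chessboard are known (e.g.,
# recorded when the chessboard was drawn into the display image), ordered the
# same way that findChessboardCorners orders the detected corners.

import cv2
import numpy as np

PATTERN = (9, 6)  # inner corners of the chessboard (54 points)

def compute_homography(capture_path, screen_corners, depth_image=None):
    image = cv2.imread(capture_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, cam_corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        raise RuntimeError("chessboard not detected in the captured image")
    cam_corners = cv2.cornerSubPix(
        gray, cam_corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
    cam_pts = cam_corners.reshape(-1, 2)

    # Homography mapping camera-image pixels to display-screen pixels (Eq. (1)).
    H, _ = cv2.findHomography(cam_pts, np.asarray(screen_corners, np.float32))

    # Average depth of the corner points at this board position; used later to
    # pick the proper homography for a tracked joint.
    avg_depth = None
    if depth_image is not None:
        avg_depth = float(np.mean([depth_image[int(y), int(x)] for x, y in cam_pts]))
    return H, avg_depth

Repeating this step with the whiteboard placed at each depth level yields the set of homographies and average corner depths used by the dynamic mapping module.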

    3.4 Dynamic Projection Mapping According to Human Movement

The dynamic mapping module is intended for the mapping of a video or an image onto the body of the user. It first checks whether there is a homography that corresponds to the depth level of the corner points stored in the calibration module. If there is no saved homography information, the dynamic mapping module does not work. If the information exists, the average depth values of the corner points are obtained at each depth level. The average depth values of the corner points are used to select the proper homography for the joint points. To correctly map each joint tracked by Kinect, the joint coordinates and the joint depth information need to be known. Kinect uses an infrared sensor to track joints; thus, the coordinates of the joints are aligned to the frame of the infrared sensor. Because the depth sensor has the same resolution as the infrared sensor, the depth information can be obtained without any additional work. However, the corner point coordinates and the homography used in this study were calculated with respect to the resolution of the color camera. Therefore, to use the joint coordinates, the joint coordinates corresponding to the color image resolution must be identified. To achieve this, we used a utility called the coordinate mapper to obtain the color image coordinates that correspond to the joint coordinates of the infrared sensor. Additionally, as shown in Fig. 8, the homography for each joint is selected from the stored set by comparing the average depth value of the corner points with the depth value of the joint, i.e., by identifying the depth level nearest to each joint point. The screen coordinates of the joint are then calculated with Eq. (1). In the proposed system, the calculated screen coordinates of the left/right shoulder and the left/right pelvis are set to the coordinates of the four corner points of the video or image specified in the configuration file, as shown in Fig. 9.
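
The depth-level selection and joint projection just described can be sketched as follows. The data structures are assumptions (a list of (average corner depth, homography) pairs produced by the calibration module, and joint coordinates already converted to the color image via the coordinate mapper), not the tool's actual interfaces.

# Sketch of selecting the homography whose calibration depth level is nearest to
# a tracked joint and projecting the joint into display-screen coordinates (Eq. (1)).

import numpy as np

def select_homography(levels, joint_depth_mm):
    """levels: list of (average_corner_depth_mm, 3x3 homography)."""
    return min(levels, key=lambda lv: abs(lv[0] - joint_depth_mm))[1]

def to_screen(H, color_xy):
    """Apply the homography to a joint's color-image coordinate."""
    x, y, w = H @ np.array([color_xy[0], color_xy[1], 1.0])
    return (x / w, y / w)

if __name__ == "__main__":
    # Three depth levels with identity homographies as dummy calibration data.
    levels = [(1500.0, np.eye(3)), (2200.0, np.eye(3)), (2900.0, np.eye(3))]
    left_shoulder = {"color_xy": (812.0, 404.0), "depth_mm": 2310.0}  # hypothetical joint
    H = select_homography(levels, left_shoulder["depth_mm"])
    print(to_screen(H, left_shoulder["color_xy"]))

The screen coordinates obtained this way for the left/right shoulders and left/right pelvis are then assigned to the four corner points of the configured video or image.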

Figure 7: Homography according to the position of the chessboard placed in different three-dimensional (3D) spaces

Figure 8: Selected homography based on the estimation of the depth level that corresponds to the joint depth

Figure 9: Mapping relation between the rectangular points of the image and joint points

    4 Implementation and Experimental Results

    4.1 Implementation Environment

In this study, we installed a Kinect V2, which includes an infrared sensor, a depth sensor, and a color camera, and a BenQ MW846UST full high-definition (HD) 3D projector with 3000 ANSI lumens. The Kinect sensor and the projector were installed at a distance from each other, as shown in Fig. 10 (right). Additionally, a moving whiteboard was used for camera–projector calibration, as shown in Fig. 10 (left). For the experiments, the Kinect sensor and the projector were placed in front of the moving whiteboard, and three homography matrices were calculated at three depth levels in an area of 3 m × 3 m. The corner points of the chessboard were automatically detected in the camera image, and the screen coordinates of the detected corner points were computed with the selected homography and Eq. (1).

Figure 10: Movable whiteboard (left) and projector/Kinect setup (right)

    4.2 Dynamic Projection Mapping for Single User

The first experiment mapped a red circle, an image, and a video to the user's body with the use of the joint coordinates of the left/right shoulders, left/right pelvis, and both hands. Fig. 11 shows mappings on the body with the use of red circles and videos. Fig. 11a is a picture showing one hand in the upward position and one in the downward position, and Fig. 11b is a picture showing both hands raised.

Figure 11: Mapping red circles and video onto a single user's body. (a) A picture with one hand up and one hand down. (b) A picture with both hands up

    4.3 Dynamic Projection Mapping for Multiple Users

The second experiment was conducted to map a red circle and an image to the body using the left/right shoulder, left/right pelvis, and hand joints of both hands of multiple users. Fig. 12 shows the red circle and the image mapped accurately onto the two users' bodies.

Figure 12: Mapping red circles and images onto the bodies of two users

    4.4 Dynamic Projection Mapping Results by User Distance

The third experiment mapped a red circle and an image to the user's body with the use of a single user's left/right shoulder, left/right pelvis, and both hand joints, using both the homography selection method based on depth-level classification and the method without depth-level classification. Fig. 13 shows the mapping results obtained with one homography, without depth-level classification. The experiment was conducted by mapping according to the user's distance and the movement of both hands. Fig. 13a shows the closest distance to the Kinect and the projector, Fig. 13b shows the intermediate distance, and Fig. 13c shows the projection from the longest distance. Fig. 13d is an image with both hands lowered, while Fig. 13e shows the user with both hands stretched forward. Figs. 13b and 13d show images in which the chessboard pattern is appropriately mapped. Figs. 13a and 13c depict images in which the chessboard and circles are erroneously mapped. Fig. 13e shows a correct mapping of the chessboard and incorrect mappings of the circles on both hands. In Fig. 14, the homography selection method based on depth-level classification was tested in the same way as the case in which only one homography was used. The experimental results show that all the images in Figs. 14a–14e are mapped correctly, unlike the method that used only one homography.

Figure 13: Mappings with the use of one homography without depth-level classification. (a) Third depth region. (b) Second depth region. (c) First depth region. (d) An image of the user with both hands lowered. (e) An image of the user with both hands stretched forward

We also developed a 3D wing model in which the position and the orientation were controlled by the left and right shoulder joints. Fig. 15 shows the wing model displayed from different viewpoints, while Fig. 16 depicts the projections of the wing models on the left and right shoulder joints of the user.

Tab. 1 shows the projection errors of the homography selection method based on depth-level classification and of the method that uses one homography without depth-level classification. To measure the projection errors, we projected 54 points onto the printed chessboard. These 54 projected points refer to the points of the screen space matched to the corner points extracted from the camera image after the printed chessboard was captured with a camera. After capturing an image that included the 54 projected points and the printed chessboard, we measured the distances between the matched points in the captured image. With the selection method, the average projection error at each depth level was similar to the projection errors at the other depth levels. The method that used one homography showed a low average projection error at only one depth level, while the average errors at all the other depth levels increased abruptly. Fig. 17 shows bar graphs of the projection errors.
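
This error measurement amounts to averaging the pixel distances between the 54 matched point pairs; a small sketch with made-up coordinates is given below (the coordinate values and noise model are illustrative only).

# Average projection error: mean Euclidean distance between the 54 projected
# points and the corresponding printed-chessboard corners in the captured image.

import numpy as np

def average_projection_error(projected_pts, printed_pts):
    projected = np.asarray(projected_pts, dtype=float)
    printed = np.asarray(printed_pts, dtype=float)
    return float(np.mean(np.linalg.norm(projected - printed, axis=1)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    printed = rng.uniform(0, 1080, size=(54, 2))             # dummy corner positions
    projected = printed + rng.normal(0, 2.0, size=(54, 2))   # small synthetic offset
    print(f"average error: {average_projection_error(projected, printed):.2f} px")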

Figure 14: Mappings using the homography selected according to depth-level classification. (a) Third depth region. (b) Second depth region. (c) First depth region. (d) An image of the user with both hands lowered. (e) An image of the user with both hands stretched forward

Figure 15: Displays of the 3D wing model at different viewpoints

Figure 16: Dynamic projection of the 3D wing model on the user's shoulder joints

Table 1: Projection errors at different depth levels

Figure 17: Comparisons of projection errors. (a) Method that selects the homography according to depth. (b) Method that uses only one homography

    5 Conclusions and Future Work

Projection mapping is one of the AR fields. It is actively used in diverse fields, such as advertising, exhibitions, and performances. Early projection mapping was applied to the outer walls of buildings or to general objects in a static form, but with the development of technology, it has become possible to project media content in a dynamic form onto the body, face, and clothes of a moving person. However, unlike static projection mapping, there are no commercialized tools that can be used for dynamic projection mapping. Therefore, to create dynamic mapping programs, convoluted programming tasks and complex camera–projector calibrations are needed. Correspondingly, in this study, we proposed a content authoring tool that can easily produce dynamic projection-mapping content. The proposed tool uses Kinect's RGB-D images to design and implement a new camera–projector calibration rather than a complex conventional camera–projector calibration. The camera–projector calibration method implemented herein did not use a printed chessboard. Instead, it projected the chessboard onto a moving whiteboard, captured the projected chessboard with the camera, and identified the corner points of the chessboard within the captured image. It then identified the corner points of the chessboard within the display image and stored their coordinate values and depth information. Furthermore, we repeated this procedure at each camera depth level to obtain a homography for each level. Based on the stored depth information, we averaged the corner point depths for each depth level and compared them with the depth values of the tracked user joints to identify the depth levels to which the joints corresponded. Using the homography at the corresponding depth level, either the image or the video specified in the configuration file was mapped to the joint. To prove the accuracy and convenience of the proposed system, we experimented with various single-user and multiuser poses and successfully mapped the joints.

In our future work, we will investigate robust and natural joint-tracking methods at different depth levels. Additionally, we plan to study how IR markers can be attached to freely deformable objects (typically cloth or paper), how to track and map them with the infrared sensor of the Kinect, and how to minimize mapping delays.

Funding Statement: This work was partially supported by the Basic Science Research Program through a National Research Foundation of Korea (NRF) grant funded by the Ministry of Education (NRF-2017R1D1A1B03035718), and was partially supported by another National Research Foundation of Korea (NRF) grant funded by the Korean government (MIST) (NRF-2019R1F1A1062752).

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
