
    Seamless and non-repetitive 4D texture variation synthesis and real-time rendering for measured optical material behavior

    2019-08-05 01:45:16
    Computational Visual Media, 2019, Issue 2

    Martin Ritz, Simon Breitfelder, Pedro Santos, Arjan Kuijper, and Dieter W. Fellner

    Abstract We show how to overcome the single weakness of an existing fully automatic system for the acquisition of spatially varying optical material behavior of real object surfaces. While expressing spatially varying material behavior with spherical dependence on incoming light as a 4D texture (an ABTF material model) allows flexible mapping onto arbitrary 3D geometry, with photo-realistic rendering and interaction in real time, this texture-like representation exposes it to common texturing problems, which manifest as two disadvantages. Firstly, non-seamless textures create visible artifacts at boundaries. Secondly, even a perfectly seamless texture causes repetition artifacts due to its organized placement in large numbers over a 3D surface. We solve both problems with a novel texture synthesis method that generates a set of seamless texture variations, randomly distributed over the surface at shading time. Compared to regular 2D textures, the inter-dimensional coherence of the 4D ABTF material model poses entirely new challenges to texture synthesis, including maintaining the consistency of material behavior throughout the 4D space spanned by the spatial image domain and the angular illumination hemisphere. In addition, we tackle the increased memory consumption caused by the numerous variations through a fitting scheme specifically designed to reconstruct the most prominent effects captured in the material model.

    Keywords optical material behavior; reflectance modeling; 4D texture synthesis; texturing

    1 Introduction and related work

    3D rendering research is continuously striving to come closer to a physically realistic representation of real-world surfaces, as it is ever more widely applied to fields where a high degree of realism is required (e.g., the 3D games industry) or where rapid prototyping is used in early stages of design (e.g., in the automotive and textile industries). The approach of measuring ABTFs (approximate bidirectional texturing functions) [1, 2] is one way to avoid the synthetic design of material models, which was the standard for a long time despite requiring significant manual effort. ABTFs were first proposed in Ref. [1], which used a quarter light arc with incident illumination from point lights, mounted at angles ranging from zero elevation (grazing angles onto the material surface) up to nearly vertically incident light directions. The acquisition of optical material behavior, while already spatially varying due to the matrix sensor of the camera, relied on the assumption of isotropy, meaning that the material sample's response is not affected by rotation around its surface normal. The resulting ABTF data consisted of two spatial dimensions (within the sample surface) and one dimension for light variation, leading to three dimensions in total. Many materials cannot be faithfully represented by assuming isotropy, as they are a combination of many different materials, each with a different physical structure, and thus reacting differently at each individual surface point to changes of the incoming light direction around the surface normal. Thus, an additional dimension of incoming light direction was added in Ref. [2]. By using a turntable which rotates the sample around its surface normal under the camera, a 4D ABTF model results from combining the two lateral dimensions with two angular dimensions describing the hemisphere of incident light directions. Also, the acquisition process was entirely automated and reduced to only a few minutes per sample. Effectively, the combination of rotary stage and quarter light arc yields an illumination hemisphere with the camera at its center, looking vertically down on the sample. The ABTF material model acquired and rendered with the technique in Ref. [2] represents actually captured real-world material behavior as a 4D texture that can be rendered in real time. By only considering one perspective vertically above the sample, it provides an abstraction of the higher-dimensional 6D BTF (bidirectional texturing function) [3], which captures the spatially varying material behavior of flat materials, discretized to the resolution of a matrix sensor (2D), for all combinations of incoming light (2D) and outgoing observer direction (2D), while preserving the dependence on the incoming light direction.

    Figure 1 puts the ABTF model within the context of the taxonomy of surface reflectance functions. It is represented by a two-dimensional stack of images (layers), each of which expresses the spatially varying material behavior on the sample surface for one specific incoming light direction. Rendering is done by the shader accessing the captured image stack according to the angle of the virtual light source in the scene relative to the surface normal and the current U-V coordinates provided to the shader while rendering the current fragment, given by the texture mapping of the 3D geometry to be rendered. The texture-based nature of the ABTF model allows arbitrary mapping onto any 3D geometry with free choice of scaling and orientation, and the abstraction from multiple observer directions allows the model to fit into current graphics memory for real-time rendering.
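
    To make this layer lookup concrete, the following Python sketch (our own illustration, not the paper's shader code) selects the captured layer nearest to a given light direction and fetches the texel addressed by the wrapped U-V coordinates; the array layout and the nearest-neighbor selection are assumptions, and the real system performs the lookup in the fragment shader.

```python
import numpy as np

def sample_abtf(abtf, uv, light_dir):
    """Sample a hypothetical ABTF stack abtf[rotation, elevation, y, x, rgb].

    Minimal nearest-neighbor sketch of the layer selection; a real shader
    would interpolate between neighboring layers and texels."""
    n_rot, n_elev, h, w, _ = abtf.shape

    # Light direction in spherical coordinates relative to the surface
    # normal (assumed to be +z here).
    x, y, z = light_dir / np.linalg.norm(light_dir)
    azimuth = np.arctan2(y, x) % (2.0 * np.pi)       # rotation around the normal
    elevation = np.arcsin(np.clip(z, 0.0, 1.0))      # angle above the surface

    # Pick the captured layer closest to the requested light direction.
    r = int(round(azimuth / (2.0 * np.pi) * n_rot)) % n_rot
    e = min(int(round(elevation / (0.5 * np.pi) * (n_elev - 1))), n_elev - 1)

    # Wrap the U-V coordinates and fetch the texel from that layer.
    px = int(uv[0] * w) % w
    py = int(uv[1] * h) % h
    return abtf[r, e, py, px]

# Example: a tiny random stack standing in for measured data.
abtf = np.random.rand(8, 4, 64, 64, 3).astype(np.float32)
print(sample_abtf(abtf, uv=(0.25, 0.75), light_dir=np.array([0.3, 0.2, 0.9])))
```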

    Despite this abstraction, the material model provides compelling results for most materials; shortcomings only arise where material behavior actually depends on the observer direction, for instance for materials with a "chameleon" effect that change color depending on the viewing angle. However, the texture-like nature of the model shares common texturing problems, which become apparent on two levels. Firstly, if single layers in the 4D texture stack are not synthesized to be seamless, border artifacts appear in the form of spatial discontinuities along the seams across texture units. Secondly, even a perfectly synthesized seamless texture causes repetition artifacts that strike the eye when large numbers of identical patches are placed side by side over the 3D surface and become visible within the field of view at the same time. These two unsolved problems are the ones addressed by this work.

    Fig. 1 ABTF model in the surface reflectance function taxonomy introduced in Ref. [4], derived from BTFs and depending on the same variables as surface reflectance functions (SRF), but with different semantics.

    Our solution to both problems is a novel texture synthesis approach that first makes the single ABTF texture layers seamless, and in a second step generates a set of variations for each layer. The patch variations are precomputed and randomly distributed on the surface at shading time, thus completely disrupting any regular patterns and entirely removing repetition artifacts, generating the impression of a non-repeating, artifact-free surface (see Fig. 2).

    For the domain of 2D texturing, there are approaches that solve these challenges [5, 6]. However, we are dealing with inter-dimensional coherence within a 4D texture, posing the strong requirement of maintaining consistency across the multi-dimensional space of measured material behavior, represented by the spatial domain and the two dimensions defined by a hemisphere of incoming light directions. This introduces challenges such as specular highlight responses, for which continuity must be ensured during realignment of texture sub-regions. Finally, generating multiple variations per ABTF layer increases memory consumption. We counter this problem using a fitting scheme specifically designed to reconstruct the most prominent effects captured by the material model.

    2 Technical approach

    2.1 Texture synthesis and periodization

    The synthesis method for measured ABTF materials developed in this work is based on image quilting [7], which assembles a new texture by transferring small patches from an input image to different new locations within the newly synthesized texture while maintaining optical similarity at the transitions within overlapping regions of neighboring patches. We extended it by adding initial reassembly phases before the original process, ensuring periodicity at the boundary regions of the texture to be synthesized so that it can be seamlessly concatenated border to border in both dimensions. Most importantly, we use the output of the algorithm not just to generate a new patch placement distribution for one texture, but instead to produce a non-deterministically generated set of patch transfer prescriptions consistently applied to all ABTF data set layers, which guarantees consistency across different incident light angles and provides a random patch placement every time to avoid repetition artifacts.

    The algorithm starts with an empty destination texture matching the input image dimensions. It consists of four phases, visualized in Fig. 3.

    · Phase 1: Patches are placed at all corners to ensure periodicity, which is achieved by choosing a random source patch and splitting it vertically and horizontally through its center for transfer to diagonally opposing corners of the target texture. As the inner patch split edges coincide with the respective outer texture edges, appending the target texture at any edge reunites the original patch, thus leading to artifact-free periodicity for the corner pieces.

    · Phase 2: A similar idea is followed to fill the horizontal edges of the target texture, with two differences. Firstly, the source patch to be transferred is now horizontally split into two halves which then are transferred to vertically opposing edges of the target texture, ensuring periodicity as in Phase 1. Secondly, to avoid deterministic behavior, the patch to be transferred is now chosen at random from a set of best matching patches. Best matches are determined based on the optical similarity between a patch and the existing target neighborhood measured in their common region of overlap, as explained in detail in Section 2.3.

    Fig. 3 Visualization of phases 1-4 of the algorithm. Left: source image. Right: synthesized texture.

    · Phase 3: Steps analogous to Phase 2 are performed to fill the vertical edges while ensuring periodicity.

    · Phase 4: The inner area of the target texture is filled following a row-major traversal scheme. A set of best matching source patches is determined for each empty block, of which one is chosen for transfer at random. Again, as in Phases 2 and 3, optical similarity is used as a metric to establish the most continuous spatial transition across seams within the target patch neighborhood.

    Parameters for this algorithm are: the set size for best matches, the block size of texture patches to be transferred, and the degree of overlap between a patch candidate and its existing destination neighborhood.
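
    For illustration, the Python skeleton below sketches this parameterization; only Phase 1 (the corner split) is spelled out, and all names, defaults, and the array layout are our own assumptions rather than the original implementation.

```python
import numpy as np

def synthesize_periodic(source, block=32, overlap=8, k_best=5, rng=None):
    """Skeleton of the four-phase periodic quilting pass (a sketch, not the
    authors' implementation).  `block` is the patch size (assumed even here),
    `overlap` the width of the region compared against the existing
    neighborhood, and `k_best` the number of best-matching candidates from
    which one is picked at random (both only used by Phases 2-4)."""
    rng = rng or np.random.default_rng()
    h, w, _ = source.shape
    half = block // 2
    target = np.zeros_like(source)

    # Phase 1: one random source patch, split through its center, is spread
    # over the four corners so that tiling the result re-unites the patch.
    y0 = int(rng.integers(0, h - block))
    x0 = int(rng.integers(0, w - block))
    patch = source[y0:y0 + block, x0:x0 + block]
    target[:half, :half] = patch[half:, half:]     # top-left     <- bottom-right quarter
    target[:half, -half:] = patch[half:, :half]    # top-right    <- bottom-left quarter
    target[-half:, :half] = patch[:half, half:]    # bottom-left  <- top-right quarter
    target[-half:, -half:] = patch[:half, :half]   # bottom-right <- top-left quarter

    # Phases 2-4 fill the horizontal edges, the vertical edges, and finally
    # the interior in row-major order; each placement ranks candidate patches
    # by the overlap error of Section 2.2 and picks one of the k_best at random.
    return target
```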

    2.2 Visual similarity metric

    The metric used to determine optical similarity, guiding the search for best matching patches for transfer (see Section 2.3), is a monochrome error image ΔI (see Fig. 5(1), gray scale region), covering exactly the area of overlap currently considered. The intensity of each pixel is computed as the sum of squared differences between source and destination image space over the three color channels at that pixel position. For 2D textures, the texture itself is the source image space for this computation, while for an ABTF dataset, consistency throughout the 4D texture must be maintained to avoid visual artifacts caused by inconsistencies between texture layers captured under different lighting directions, so a suitable image space must be chosen that represents the entire ABTF dataset.
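
    A minimal sketch of this error computation, assuming the chosen image space is a three-channel array and that not-yet-defined destination pixels are excluded by a boolean mask:

```python
import numpy as np

def overlap_error(candidate, existing, mask):
    """Monochrome error image ΔI over the current overlap region (a sketch).

    candidate, existing : H x W x 3 arrays taken from the source image space
    mask                : H x W boolean array, True where destination data
                          already exists; pixels outside the mask contribute
                          no error, as in the masked region of Fig. 5(1)."""
    diff = candidate.astype(np.float64) - existing.astype(np.float64)
    ssd = np.sum(diff * diff, axis=-1)      # sum of squared differences over RGB
    return np.where(mask, ssd, 0.0)

def overlap_cost(candidate, existing, mask):
    """Scalar cost used to rank candidate patches for the best-match set."""
    return overlap_error(candidate, existing, mask).sum()
```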

    In Ref. [8], a BTF dataset is represented by a height map computed from all captured images, representing the approximate geometry of the surface. In addition to the height map, we compute a diffuse map and a reflectance map (see Fig. 4). These three images represent the most relevant aspects of the entire ABTF dataset and are combined by weighted per-pixel summation into one reference image R, which serves as the source image space. During patch transfer, the impact of the respective diffuse, height, and reflectance contributions is controlled by weights. Computing ΔI is the most computationally intensive step, as it has to be done for each overlapping region evaluated during the search for best matches. The algorithm described in Section 2.1 has to run in the serial order described, as subsequent patch placement depends on previously placed patches. The sole purpose of the reference image R is to serve as the basis for generating a set of transfer prescriptions, which are then applied identically to each texture layer in the ABTF dataset.

    The diffuse map is the normalized sum of all images captured over all lighting directions. The surface geometry is represented by a height map, which is computed in two steps. Firstly, a normal map is estimated by evaluating the strongest response for each pixel over all lighting directions. The vectors computed here do not in general represent the surface normal, but the averaged direction to the strongest incoming light contribution; this difference only matters for strongly reflective materials, while there is no difference for perfectly diffuse materials. The height map follows in the second step from integration of the normal map [9]. The reflectance map of the surface is computed by comparing two normal maps weighted with different exponents to find strongly reflective areas on the surface.
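
    As an illustration of how these component maps could be assembled, the sketch below computes the diffuse map as described and combines the three maps with user-chosen weights; treating R as a three-channel image and the concrete weight values are our assumptions.

```python
import numpy as np

def diffuse_map(stack):
    """Diffuse map as the normalized sum of all captured images; `stack` is
    a hypothetical array shaped [n_light_directions, H, W, 3]."""
    d = stack.astype(np.float64).sum(axis=0)
    return d / d.max()

def reference_image(diffuse, height, reflectance, w_d=1.0, w_h=1.0, w_r=1.0):
    """Weighted per-pixel combination of the three component maps into the
    reference image R that guides patch transfer; height and reflectance are
    scalar maps and are broadcast over the RGB channels."""
    return (w_d * diffuse
            + w_h * height[..., None]
            + w_r * reflectance[..., None])
```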

    Fig. 4 Left: single example image from the ABTF dataset. Right: reference image components for patch transfer guidance based on optical error, derived from the entire ABTF dataset to represent surface structure and material behavior.

    2.3 Seamless transfer

    Naive copying of rectangular areas leads to highly noticeable edges at the boundaries even if the general structure fits the local neighborhood (see Figs. 5(1) and 5(2)). Image quilting reduces this effect by allowing a free-form boundary cut which minimizes optical differences between the source patch and the target environment (see Figs. 5(3) and 5(4)). Optical differences are expressed by the error image ΔI introduced in Section 2.2, which at the same time works as a mask, excluding all pixels set to zero (black) from the error metric (see Fig. 5(1)).

    Finding the free-form boundary cut following the path with the minimum cumulative error in ΔI is a path-finding problem which can be solved using Dijkstra's algorithm. The algorithm presented here computes many path candidates in a set of sub-regions of ΔI. Every pixel in a sub-region represents a node in the Dijkstra graph. Edges are generated for all adjacent pixels (only horizontally and vertically, not diagonally), using the average error (i.e., intensity value in ΔI) of the two pixels as cost. The inner region is masked out (see the black square in the center of Figs. 5(1), 5(3), and 6), as this part of the patch is adopted unchanged and must thus not be crossed by the boundary cut. At first, only the sub-regions at the left, top, right, and bottom edges are considered. All combinations of start and end points on the yellow lines for each of the four highlighted regions are evaluated using Dijkstra's algorithm, yielding the path with minimum cost for the next step (see Figs. 6(2) and 6(3)). The endpoints of the selected paths (yellow in Fig. 6(4)) are then connected through the corner regions, which requires only one evaluation of Dijkstra's algorithm per corner. The full cyclic path (see Fig. 6(5)) represents the minimum error boundary cut.
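
    A compact sketch of the per-sub-region search, using Dijkstra's algorithm over a 4-connected pixel grid with the average of neighboring error values as edge cost (the masked inner square can be excluded beforehand by setting its error to infinity); this illustrates the idea and is not the authors' code.

```python
import heapq
import numpy as np

def min_cost_path(error, start, goal):
    """Dijkstra over one sub-region of ΔI (a sketch).  Nodes are pixels,
    edges connect 4-neighbors, and the edge cost is the average error of the
    two pixels.  Returns the minimum-cost pixel path from start to goal."""
    h, w = error.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (y, x) = heapq.heappop(heap)
        if (y, x) == goal:
            break
        if d > dist[y, x]:
            continue                      # stale heap entry
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w:
                nd = d + 0.5 * (error[y, x] + error[ny, nx])
                if nd < dist[ny, nx]:
                    dist[ny, nx] = nd
                    prev[(ny, nx)] = (y, x)
                    heapq.heappush(heap, (nd, (ny, nx)))
    # Walk back from the goal to recover the pixel path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```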

    Fig. 5 Optical error image ΔI (1); naive transfer of rectangular patch exposing visual discontinuities (2); minimum boundary cut through optical error field (3); transfer of boundary region (4).

    Fig. 6 Optical error image ΔI (1); possible minimal paths through the four edge-regions (2); paths with minimal cost (3); the four path segments (endpoints yellow) are connected (4); resulting cyclic boundary cut (5).

    To improve the resulting image quality, the boundary cut is not used as a hard edge between patch content and existing destination neighborhood. Instead, blur masks are created for the areas inside and outside the boundary cut using Gaussian blurring (see Fig. 7). The patch content and the neighborhood background are then combined by weighting with their respective masks, leading to a smooth transition.
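
    A possible realization of this soft compositing, assuming a SciPy Gaussian filter as the blur and a binary inside/outside mask derived from the boundary cut:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blend_patch(patch, background, inside_mask, sigma=2.0):
    """Soft compositing along the boundary cut (a sketch).  `inside_mask` is
    1 inside the cyclic boundary cut and 0 outside; blurring it gives the
    per-pixel weight of the patch content, and the complement weights the
    existing neighborhood, producing the smooth transition of Fig. 7."""
    weight = gaussian_filter(inside_mask.astype(np.float64), sigma)
    weight = np.clip(weight, 0.0, 1.0)[..., None]      # broadcast over RGB
    return weight * patch + (1.0 - weight) * background
```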

    As the resulting texture is generated iteratively, some regions do not yet contain data and thus provide no basis for error computation, e.g., the black region in the example in Fig. 8 (left). When placing a patch at the right border region, the overlap error can be computed only for the region of overlap in the center of Fig. 8 (middle, shown as a grayscale error image). As the error for all paths through yet undefined (empty) regions is 0, the decision for the best path is ambiguous and can yield bad results. The red line in Fig. 8 (middle, above) represents the path with minimal error, but does not consider the entire viable area within the existing neighborhood, resulting in visible discontinuity artifacts highlighted by the red rings in the resulting texture in Fig. 8 (right, above). We solve this problem by introducing border flags for the main directions left, top, right, and bottom, which force the path-finding algorithm to enclose the full region inside the boundary cut towards the directions indicated by the respective flags. The boundary cut in Fig. 8 (middle, below) results from setting the top and bottom border flags, leading to a soft transition between both patches, as the entire border region is considered during computation of the optimal path, as visible in Fig. 8 (right, below).

    Fig. 7 Boundary cut used to draw a filled polygon defining the weight of the patch content to smooth the transition using Gaussian blur.

    Fig. 8 Border flags controlling the extent of the region considered by visual matching during patch placement. Left: initial coverage with undefined regions on the right. Middle: error image in overlap region (grayscale) and two possible candidates for the boundary cut assigned the same cost in the yet empty areas of the canvas. The upper candidate does not consider the neighboring regions above and below the existing patch, while the lower candidate is aware of the unset regions. Right: difference in results.

    2.4 Variation synthesis and rendering

    Periodic regular textures can be mapped to large surfaces back to back without visible repetition artifacts, but in the case of irregular and stochastic textures, highly noticeable periodic patterns appear, as in Fig. 9 (left). This problem can be handled by generating tile sets within which certain combinations of elements are periodic; the variation of different combinations can then be used to fill a regular grid and thus create a textured surface without visible seams and repetitions. Many tile sets are based on Wang tiles [5], which define different types of edges and compute a tile for every combination of edge types so that there is always a compatible tile for a specific neighborhood of adjacent edges. Only tiles with the same edge type (types visualized in different colors in Fig. 10) can be placed next to each other to share one edge. The variation synthesis technique introduced in this work generates a corner-based tiling similar to colored corners tiling [10]. A complete tile set created using V corner types yields V⁴ tiles in total. Already V = 2 leads to 16 tiles, which removes repetition artifacts, as the comparison in Fig. 9 shows. Figure 11 gives a schematic overview of the corner compatibility within a set of 16 tile variations, where compatibility between neighboring tiles is expressed by matching color. Compatibility between neighboring tiles is strongly limited by edge compatibility and is only given for a subset of the tile set.
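
    The corner-based tile set can be enumerated directly; the sketch below builds the V⁴ combinations and picks the tile matching the corner colors around a grid cell (the data structures are illustrative, not the paper's).

```python
from itertools import product

# Colored-corners tiling with V corner types: one tile per combination of
# (top-left, top-right, bottom-left, bottom-right) corner colors, i.e. V^4
# tiles; V = 2 already yields the 16 variations used in this work.
V = 2
tiles = {corners: index for index, corners in enumerate(product(range(V), repeat=4))}
print(len(tiles))  # 16

def tile_for_cell(corner_color, i, j):
    """Pick the tile whose corners match the colors assigned to the four
    lattice corners around grid cell (i, j); `corner_color` is any function
    mapping a lattice corner to one of the V colors, so adjacent cells
    automatically agree along their shared corners and edges."""
    key = (corner_color(i, j), corner_color(i, j + 1),
           corner_color(i + 1, j), corner_color(i + 1, j + 1))
    return tiles[key]
```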

    Fig. 9 Left: periodic pattern caused by repeating the same texture 10 times in both dimensions. Right: the same tiling rendered with randomly distributed seamless texture variations.

    Fig. 10 Tiles incompatible at common corner despite compatible edges, solved by enforcing common corners.

    A pseudo-random number generator implemented in the shader as a hash function guarantees random selection of variations for mapping onto the surface during rendering, while keeping the overall mapping over the entire object surface constant without the need to store the chosen assignment of variations. Accessing texture patches at rendering time requires consistency, since texture variations, once chosen and mapped to a rendered part of the object, must be chosen identically each subsequent time the respective area is rendered. We use a pseudo-random number generator to satisfy this requirement. The generator works like a hash function: it is deterministic, always assigning the same value to the same parameters, but potentially assigning the same value to different parameters. This is acceptable, as the number of texture variations is limited and they have to be reused multiple times across the object surface. The generator accepts an initial seed value that can be used to control the texture variation distribution and to easily generate different renderings using the same input texture and 3D model. The available texture patch variations are addressed using U-V mapping, where multiples of the default range [0,1] address the virtual array of texture variations (see Fig. 12).
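
    A sketch of this idea in Python, using a common sine-based hash in place of the (unspecified) shader hash; the constants and function names are our own, but the behavior is the same: identical tile coordinates and seed always map to the same variation index.

```python
import math

def variation_index(tile_u, tile_v, seed=0, n_variations=16):
    """Deterministic pseudo-random choice of a texture variation per tile.
    The sine-based hash below is a stand-in for the shader hash: the same
    (tile_u, tile_v, seed) always yields the same index, so no per-tile
    assignment has to be stored, and different seeds give different
    distributions."""
    h = math.sin(tile_u * 127.1 + tile_v * 311.7 + seed * 74.7) * 43758.5453
    return int((h - math.floor(h)) * n_variations)

def sample_with_variations(u, v, seed=0):
    """U-V values beyond [0, 1] address the virtual array of variations, as
    in Fig. 12: the integer part selects the tile, the fractional part the
    texel within the chosen variation."""
    tile_u, tile_v = math.floor(u), math.floor(v)
    return variation_index(tile_u, tile_v, seed), (u - tile_u, v - tile_v)

print(sample_with_variations(3.25, 7.8))   # same output on every call
```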

    Fig. 11 Abstract tile set of minimum size based on 2 corner types (red and black), leading to 16 variations, indicating corner compatibilities (above). Example of 2×2 neighborhood construction (below).

    2.5 Fitting

    The smallest possible tile set using V = 2 corner types (V = 1 is equivalent to a single periodic texture) leads to 2⁴ = 16 texture variations in total (see Fig. 11). The memory consumption of one data set depends on the image resolution W × H and the sampling density of the virtual lighting hemisphere, expressed by R rotations of the sample and E discrete elevations of the light source, and amounts to M = 3 × W × H × R × E bytes. The data sets used in this work range from 300 MB to 1 GB, which leads to a total memory consumption of 4.7-16 GB including all texture variations. This amount exceeds typically available graphics memory and would also result in poor frame rates because of the lack of memory locality, as the renderer needs to access a widespread range of data when the surface is rendered under varying illumination. The key to real-time processing of large tile sets is thus compression. We use a fitting scheme designed explicitly for this challenge. We limit the memory consumption for V = 2 corner types, or 16 texture variations, per ABTF data set to under 2 GB by fitting an analytic model based on the HSL color space (Fig. 13(right)). The choice of this specific color space for fitting enables reconstruction of the reflectance behavior for each single pixel on the sample surface (Fig. 13(left)) without visual degradation.
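
    A quick worked example of this estimate (the resolution and sampling numbers below are illustrative, not the paper's):

```python
# Worked example of M = 3 * W * H * R * E bytes for one data set (3 bytes per
# RGB texel); the resolution and sampling numbers are illustrative only.
W, H = 1024, 1024          # spatial resolution of one layer
R, E = 24, 13              # sample rotations x light elevations
M = 3 * W * H * R * E      # bytes for one texture variation
variations = 16            # V = 2 corner types -> 2**4 variations
print(f"one variation: {M / 2**30:.2f} GiB, "
      f"full tile set: {variations * M / 2**30:.2f} GiB")
```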

    Analysis of the reflectance function for different pixels (Fig. 14), transformed into HSL color space, confirms that hue remains mostly constant over the entire spectrum of incident illumination angles, as can be seen in Fig. 15 for one specific pixel, evaluated in HSL over the entire 2D illumination space. Similarly, saturation remains mostly unaffected by changing illumination directions. This observation allows us to keep hue and saturation constant. The reason why we use the HSL model rather than HSV or similar color models is that the lightness value alone controls the transition from black via color to white, while in other models this transition requires changing further parameters such as the saturation value.

    Fig. 12 Left: different texture variations Tu,v side by side with shared corners Ci,j. Right: texture patches are addressed using U-V mapping. Different variations are accessible by using multiples of the default range [0,1].

    Fig. 13 Left: visualization of reflectance behavior for one pixel for the full angular spectrum (light source elevation θ and rotation angle φ). Right: HSL color spectrum.

    Fig. 14 Reflectance functions for single pixels. Left: image captured at φ = 90°, θ = 79°. Right: corresponding reflectance functions for the two highlighted pixels.

    The lightness value for the two-dimensional lighting direction (azimuth, elevation) is reconstructed using six parameters per azimuth light angle, plus two parameters for hue and saturation, for every pixel. In compressed form, the tile sets fit into typically available graphics memory and can thus be rendered in real time.
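
    A per-pixel sketch of such a fit, under the assumptions stated above (hue and saturation stored as constants, lightness fitted per azimuth with six parameters); the paper does not name the analytic model, so a degree-5 polynomial over elevation is used here purely as a stand-in.

```python
import colorsys
import numpy as np

def fit_pixel(responses, elevations):
    """Per-pixel fit sketch: hue and saturation are stored as constants and
    lightness is fitted per azimuth angle with six parameters (here the six
    coefficients of a degree-5 polynomial over elevation, our assumption).

    responses  : [n_azimuth, n_elevation, 3] RGB values in [0, 1] for one pixel
    elevations : [n_elevation] light elevation angles."""
    hls = np.array([[colorsys.rgb_to_hls(*rgb) for rgb in row] for row in responses])
    hue = hls[..., 0].mean()        # nearly constant over illumination (Fig. 15)
    sat = hls[..., 2].mean()        # nearly constant over illumination
    coeffs = np.array([np.polyfit(elevations, hls[a, :, 1], deg=5)
                       for a in range(hls.shape[0])])
    return hue, sat, coeffs         # 2 + 6 parameters per azimuth angle

def reconstruct_pixel(hue, sat, coeffs, azimuth_index, elevation):
    """Reconstruct the RGB response for one lighting direction from the fit."""
    lightness = float(np.clip(np.polyval(coeffs[azimuth_index], elevation), 0.0, 1.0))
    return colorsys.hls_to_rgb(float(hue), lightness, float(sat))
```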

    Fig. 15 Lightness curves of a pixel for different angles of rotation in HSL color space. Note how hue (red) and saturation (blue) remain almost constant. The deviation in the graph is due to the specular highlight at the respective position.

    A visual comparison of two rendered 3D models mapped with two different physically measured ABTF material samples processed by this method is given in Fig. 16. The renderings on the left were generated from the original uncompressed ABTF dataset (reference), shown side by side with renderings using the same settings but evaluating the compressed dataset after applying the fitting scheme described above (the final output of the algorithm). The only perceptible visual difference is a slight discrepancy in hue. This effect is a consequence of the assumption that, for all variations of incoming light direction, the hue channel of the material response expressed in the HSL color model remains constant, which is only approximately the case, as can be seen in Fig. 15 (red curve).

    Fig. 16 Visual comparison of original (left) and compressed (right) ABTF datasets, captured from two different physical material samples, “Ingrain” (above) and “Metal” (below). Both material samples feature a wide spectrum of optical effects such as self-shadowing and anisotropic reflections, which are successfully reconstructed by the fitting model presented in this work.

    3 Results

    Fig. 17 3D rendering of the ABTF material model. Left: state of the art before this work. Right: our results. Above: plane mapped with a 2×2-tiled texture, showing clear discontinuity boundaries at the seams, which are entirely removed by our solution. Center: armchair model mapped with a multi-repetition stone texture; our results do not show any repetition artifacts. Below: teapot model mapped with a carpet texture, exposing repetitions of identical textures, while our result appears to consist of a single non-repetitive texture.

    The synthesis method for measured ABTF material models developed in this work is based on image quilting [7], which assembles a new texture by transferring small patches from an input image of identical dimensions to different new locations within the new image while maintaining optical similarity within overlapping regions of neighboring patches. We extended it by adding initial reassembly phases during which the target texture is made periodic at its boundary regions so that it can be seamlessly concatenated border to border in both dimensions. Most importantly, we use the output of the algorithm not just to generate a new patch placement distribution for one texture, but to produce a non-deterministically generated set of patch transfer prescriptions applied to all ABTF data set layers, which both guarantees consistency across different light angles and provides a random patch placement for every texture variation to avoid repetition artifacts. The consequence of using an entire set of texture variations for all illumination angles is a corresponding increase in memory consumption. Exploiting the fact that only one dimension in HSL color space is significantly affected by changes in the direction of incident illumination, we compensate for the increased memory consumption with a novel fitting scheme. Consequently, the reflectance behavior for each single pixel on the sample surface is reconstructed without visual degradation, while 3D rendering of models mapped with texture-variation-based ABTF data sets can still be done in real time.

    4 Conclusions and future work

    We have removed the most significant limitation of an existing 4D texture-based system for acquiring spatially varying optical material behavior of real object surfaces, in that now neither texture seam artifacts nor repetition artifacts disturb the compelling visual experience when rendering the material applied to arbitrary 3D geometries, even if the textures are mapped to large surfaces with many repetitions.

    Not only have we extended current work on 2D texture synthesis by a number of optimizations providing more robustness across the different classes of textures (regular, near-regular, irregular, and stochastic), but we have especially tackled entirely new challenges posed by the higher dimensionality and inter-dimensional correlation of 4D ABTF material texture sets, and compensated for the resulting increase in memory consumption such that 3D rendering can still be done in real time, just like with the previous approach, but without causing texture artifacts.

    As future improvements, we see firstly that the quality of synthesized tile sets could be automatically adjusted according to the occurrence of prominent patterns within the texture that draw the observer's attention, since periodicity artifacts depend strongly on the texture class. Secondly, to avoid noisy rendering for very dense tiling due to poor surface sampling, mipmaps adjusted to texture variations could be used that store a texture pyramid with increasing resolution to adaptively react to the sampling resolution.

    Acknowledgements

    This work was partially supported by the European project MAXIMUS (No. FP7-ICT-2007-1-217039) and the German Federal Ministry for Economic Affairs and Energy project CultLab3D (No. 01MT12022E).
