
Boundary-aware texture region segmentation from manga

Xueting Liu 1,2, Chengze Li 1,2, and Tien-Tsin Wong 1,2 (✉)

Computational Visual Media, 2017, Issue 1

Due to the lack of color in manga (Japanese comics), black-and-white textures are often used to enrich the visual experience. With the rising need to digitize manga, segmenting texture regions from manga has become an indispensable basis for almost all manga processing, from vectorization to colorization. Unfortunately, such texture segmentation is not easy, since textures in manga are composed of lines and exhibit features similar to those of structural lines (contour lines). So currently, texture segmentation is still performed manually, which is labor-intensive and time-consuming. To extract a texture region, various texture features have been proposed for measuring texture similarity, but precise boundaries cannot be achieved since boundary pixels exhibit different features from inner pixels. In this paper, we propose a novel method which also adopts texture features to estimate texture regions. Unlike existing methods, the estimated texture region is only regarded as an initial, imprecise texture region. We expand the initial texture region to the precise boundary based on local smoothness via a graph-cut formulation. This allows our method to extract texture regions with precise boundaries. We have applied our method to various manga images, and satisfactory results were achieved in all cases.

Keywords: manga; texture segmentation

    1 Introduction

Manga is a world-wide popular form of entertainment enjoyed by people of all ages (see Fig. 1). Nowadays, with the development of electronic devices, more and more people read manga on electronic devices such as computers, tablets, and even cellphones. With the power of electronic devices, manga can be presented in a visually enriched form by adding color, motion, or stereoscopic effects. There is thus a rising trend in the manga industry to convert legacy manga books into digital versions. During the digitization process, one major challenge is to segment texture regions from manga as a basis for various applications such as vectorization and colorization. However, texture region segmentation is not easy for manga, since textures in manga are composed of lines. This leads to the difficulty of discriminating structural lines from textures, and further to the difficulty of identifying the precise boundary of each texture region. Therefore, texture region segmentation is currently still performed manually in the manga industry. As one may imagine, this process is quite tedious and time-consuming.

Fig. 1 Manga images.

To help identify and classify textures, various texture features have been proposed in the computer vision field [1]. The similarity of the texture features of two pixels shows whether these two pixels are inside the same texture region. Texture segmentation or texture classification techniques can be further applied to extract texture regions based on the similarity of texture features. However, since texture features are analyzed within a local neighborhood, pixels in the interior of a texture region (e.g., the blue box in Fig. 2) and pixels near the boundary of a texture region (e.g., the red and orange boxes in Fig. 2) exhibit different texture features (see Fig. 2(c)), even though they belong to the same texture region. Thus, pixels near texture boundaries may be mistakenly regarded as not belonging to that texture region. This will lead to imprecise boundaries of the segmented texture region if only similarity of texture features is considered (see Fig. 3).

To resolve the boundary issue, we noticed that texture smoothing techniques are quite powerful in suppressing local textures while still preserving sharp boundaries. Using texture smoothing methods, one might first smooth the input manga image, and then perform intensity-based image segmentation to extract texture regions. However, a texture region may have a spatial-varying texture (e.g., the background region in Fig. 4(a) changes from dark to light vertically). Furthermore, texture smoothing is also incapable of differentiating textures with similar overall intensities (as in Fig. 4(b)). Therefore, we still need to analyze texture features in order to robustly handle spatial-varying textures and textures with similar intensities.

Fig. 3 Texture region segmentation based on Gabor features. Note that a precise boundary cannot be achieved.

Fig. 4 (a) A background region with a spatial-varying texture. (b) Different textures may have similar intensities after smoothing.

In this paper, we propose a novel, user-interaction-based texture region segmentation system, which integrates texture feature analysis and texture smoothing techniques in order to segment texture regions with precise boundaries from manga images. The user may draw one or several strokes inside the region or object to be segmented. Our system automatically estimates a region mask from the user input. To do so, we first summarize the texture features of the user-drawn strokes, and estimate an initial texture region having similar texture features to the user-drawn strokes. In this step, we adopt a conservative similarity measurement for estimating the initial texture regions in order to guarantee that all pixels inside the initial texture region lie within the user-specified region. We then expand the initial texture region to the precise boundaries using a graph-cut formulation. We formulate the accumulated smoothness difference from the initial texture region as the data cost, and the local smoothness difference as the smoothness cost. Using this formulation, we can extract the user-specified region with a precise boundary.

We demonstrate the effectiveness of our texture region segmentation system on a set of manga images containing various types of textures, including regular textures, irregular textures, and spatial-varying textures. Our contributions can be summarized as follows:

• We propose a novel user-interaction-based system for extracting texture regions with precise boundaries.

• Our system can handle user strokes drawn across multiple different textures simultaneously.

    2 Related work

While only a little existing research is tailored to extracting texture regions from manga, extensive research has been done on identifying and segmenting textures in natural photos. We can roughly classify this related research into three categories: feature-based texture segmentation, regular texture analysis, and texture smoothing.

Feature-based texture segmentation. Texture features are based on statistical models describing local characteristics of point neighborhoods. The statistical model of a texture feature usually covers a range of texture properties, such as size, aspect ratio, orientation, brightness, and density [3]. Various texture features have been proposed to describe and model textures, including various Gabor filters [4], filter bank responses [5], random field models, wavelet representations, and so on. In particular, the Gabor filter was adopted by Ref. [6] to analyze textures in manga, and is still considered to be the state-of-the-art method. Texture features are utilized in various applications such as texture classification, segmentation, and synthesis. Texture segmentation methods fall into two categories, supervised [8] and unsupervised [6, 7]. In particular, Ref. [6] proposed to segment textures in manga images via a level-set method based on Gabor features. However, as we have stated, although feature-based texture segmentation methods can identify textures well and differentiate between textures, precise region boundaries cannot be achieved, since boundary pixels exhibit different texture features from interior pixels. In contrast, our method can extract texture regions with precise boundaries.

Regular texture analysis. For regular or near-regular texture patterns, attempts have been made to detect and analyze the regularity of the textures based on spatial relationships [9–11]. In particular, Liu et al. [12] considered how to detect and remove fences in natural images. A stream of research has also considered de-screening, i.e., detecting and smoothing halftone textures. An in-depth survey of de-screening can be found in Ref. [13]. Kopf and Lischinski [14] discussed how to extract halftone patterns in printed color comics by modeling dot patterns. Very recently, Yao et al. [15] considered how to extract textures from manga by modeling three specific texture primitives: dots, stripes, and grids. However, these methods can only handle a small set of pre-defined regular textures. In comparison, our method can handle regular or irregular, and even spatial-varying, textures of the kind that exist in real manga images.

Texture smoothing. In order to differentiate textures and structures, various edge-preserving texture smoothing methods have been proposed, such as the total variation regularizer [16–19], bilateral filtering [20–22], local histogram-based filtering [23], weighted least squares [24], extrema extraction and extrapolation [25], L0 gradient optimization [26], and relative total variation (RTV) [2]. While these texture smoothing methods may suppress local oscillations based on local information, they are incapable of identifying a texture region or differentiating between two textures. This is because these methods do not model textures or structures, so they do not have a higher-level understanding of the semantics of the textures. In this paper, we thus utilize texture features to identify textures, but we incorporate texture smoothing techniques to identify sharp texture boundaries.

    3 Overview

The input to our system includes a manga image (see Fig. 5(a)) and one or more user-specified strokes (see Fig. 5(b)). To extract regions with similar textures to the ones identified by the user-specified strokes, we first summarize the texture features of the pixels belonging to the strokes. The texture features we use are Gabor features, which are also used by Ref. [6] in a manga colorization application. Since textures may spatially vary, and one user-specified stroke may go across several texture regions (see Fig. 5(b)), we summarize several main texture features inside each stroke by clustering. Then we calculate the similarity between the texture feature of each pixel in the manga image and the clustered main texture features, to form a texture similarity map (see Fig. 5(c)). In this map, the intensity of pixel values indicates the similarity of texture feature values with those of the user-specified strokes. Based on the computed texture similarities, we then obtain one or several initial texture regions using a graph-cut formulation (see Fig. 5(d)). Our initial texture region extraction method is detailed in Section 4.

Fig. 5 System overview (image size: 778×764, loading time: 3.51 s, processing time: 0.58 s).

We then expand the initial texture regions to their precise boundaries. To do so, we first obtain a smoothed image from the input image using texture smoothing techniques (see Fig. 5(e)). Amongst all existing texture smoothing techniques, we found that the RTV metric proposed by Ref. [2] is the most effective at smoothing textures while preserving sharp structures in manga images. In the smoothed image, if two neighboring pixels have close intensity values, it means that these two pixels are very likely to be inside the same texture region. Conversely, if two neighboring pixels have a jump in intensity values, it means that these two pixels are very likely to be inside two different regions. Therefore, we can diffuse the initial texture regions using the local intensity continuity of the smoothed image to obtain a diffusion map (see Fig. 5(f)). This diffusion map shows how smoothly each pixel is connected to the initial regions. If a pixel has a low diffusion value, it means that this pixel is smoothly connected to the initial region, so it is very likely that this pixel is inside the same texture region. Conversely, a pixel with a high diffusion value is very likely to lie outside the texture region. Finally, we extract the precise texture region based on the diffusion map using another graph-cut formulation (see Figs. 5(g) and 5(h)). Our region expansion method is detailed in Section 5.

We have evaluated our system on various manga, and the results are shown in Section 6. We also show how users can easily adjust the retrieved texture region using a single parameter.

    4 Initial region extraction

Given an input manga image and one or more user-specified strokes, we first extract the initial regions with similar textures to the user-specified strokes. To do so, we first summarize the texture features of the pixels inside the strokes. Then we obtain a texture similarity map which gives the texture similarity between each pixel in the image and the summarized texture features. The initial regions are then extracted using a graph-cut formulation.

    4.1 Texture feature summarization

To judge whether two pixels have similar textures, we use statistical features in the Gabor wavelet domain [27], which have already proved useful in differentiating textures in manga [6]. A Gabor feature vector is an M×N-dimensional vector, where M is the number of scales and N is the number of orientations used in the Gabor feature analysis. In this paper, we fix the numbers of scales and orientations to M = 4 and N = 6 respectively in all our experiments. Therefore, for a pixel p in the manga image, its Gabor feature vector G_p describing the local texture feature around p is 24-dimensional.
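
To make the feature concrete, the sketch below computes such a per-pixel response vector with scikit-image, using 4 scales × 6 orientations of Gabor filters and taking the filter response magnitude at each pixel. The specific frequencies and the placeholder file name are illustrative assumptions; the paper follows the feature design of Ref. [27] and does not restate those parameters here.

```python
import numpy as np
from skimage import img_as_float, io
from skimage.color import rgb2gray
from skimage.filters import gabor

def gabor_feature_map(gray, n_scales=4, n_orients=6):
    """Stack per-pixel Gabor magnitude responses into an
    (H, W, n_scales * n_orients) array, so that features[y, x]
    plays the role of the 24-D vector G_p."""
    responses = []
    # Assumed dyadic frequencies; illustrative values only.
    frequencies = [0.05 * (2 ** s) for s in range(n_scales)]
    for f in frequencies:
        for o in range(n_orients):
            theta = o * np.pi / n_orients
            real, imag = gabor(gray, frequency=f, theta=theta)
            responses.append(np.hypot(real, imag))  # response magnitude
    return np.stack(responses, axis=-1)

# "manga_page.png" is a placeholder input image.
img = img_as_float(io.imread("manga_page.png"))
gray = rgb2gray(img[..., :3]) if img.ndim == 3 else img
features = gabor_feature_map(gray)
```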

Given the set U of pixels covered by the user-specified strokes u1, u2, ..., we could calculate a main texture feature for these strokes by averaging the texture features of all pixels inside the strokes as

$$ \bar{T} = \frac{1}{|U|} \sum_{p \in U} G_p $$

where |U| is the cardinality of U. However, a textured area may have a spatial-varying texture, or a single user-specified stroke may go across multiple textured areas at the same time. For example, the moon in Fig. 5 cannot be specified by a single texture feature vector. Therefore, we represent the textures determined by the user-specified strokes using multiple texture feature vectors. To extract the most representative textures for the user-specified strokes, we use the k-means clustering method to cluster the texture features of all pixels inside the strokes into k groups as T_U = {T_1, ..., T_k}. These satisfy the standard k-means criterion, minimizing the within-cluster sum of squared feature distances:

$$ \{T_1, \dots, T_k\} = \arg\min \sum_{i=1}^{k} \sum_{p \in C_i} \lVert G_p - T_i \rVert^2 $$

where C_i denotes the set of stroke pixels assigned to the i-th cluster.
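
A minimal sketch of this summarization step, assuming the Gabor features are stored as an (H, W, 24) array (as in the previous sketch) and the user strokes as a boolean mask; the value of k and the scikit-learn defaults are illustrative choices.

```python
import numpy as np
from sklearn.cluster import KMeans

def summarize_stroke_textures(features, stroke_mask, k=3):
    """Cluster the Gabor features of the stroke pixels into k
    representative texture feature vectors T_1, ..., T_k."""
    stroke_feats = features[stroke_mask]           # (#stroke pixels, 24)
    k = min(k, len(stroke_feats))                  # guard against tiny strokes
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(stroke_feats)
    return km.cluster_centers_                     # (k, 24) array for T_U

# representative = summarize_stroke_textures(features, stroke_mask, k=3)
```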

    4.2 Initial region extraction via graph-cut

From the summarized representative texture feature vectors of the user-specified strokes, we then calculate a texture similarity value between each pixel p and the representative textures T_U; the similarity is high when the Gabor feature vector of p is close to at least one of the representative texture feature vectors.

Pixels with higher texture similarity are more likely to be inside the texture region specified by the user.
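
One plausible form of this similarity, writing M_p for the similarity of pixel p and assuming a Gaussian falloff over feature distance with bandwidth σ_t (an assumed parameter, not one given above), is

$$ M_p = \max_{1 \le i \le k} \exp\!\left( -\frac{\lVert G_p - T_i \rVert^2}{2\sigma_t^2} \right) $$

so that M_p approaches 1 when the Gabor feature of p matches at least one representative texture, and drops toward 0 otherwise.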

Using the calculated texture similarity, we extract an initial texture region via a graph-cut formulation. There are two terminal nodes, the source and the sink, in our graph. Each pixel p in the image corresponds to a non-terminal node n_p, which is connected to both the source and the sink. If the graph-cut result connects a non-terminal node (pixel) to the source, this pixel lies inside the initial regions; otherwise, it lies outside the initial regions. Each edge connecting a non-terminal node and a terminal node is associated with a data cost D derived from the texture similarity of the corresponding pixel.

For every pair of (4-connected) neighboring pixels p and q in the image, we connect n_p and n_q by an edge. Each edge connecting two non-terminal nodes is associated with a smoothness cost S(n_p, n_q) which measures our confidence that these two neighboring pixels should be assigned the same label, and therefore belong to the same texture region. We model the smoothness cost using the magnitude of the difference of the texture feature vectors of the two nodes n_p and n_q.

Intuitively speaking, if two neighboring pixels p and q have similar textures, the smoothness cost S(n_p, n_q) should be low, and there is a high probability for them to be assigned the same label, and so belong to the same texture region.

After constructing the graph, we can obtain the optimal cut through an optimization process which minimizes the energy function

$$ E(\{u_p\}) = \sum_{p} D(n_p, u_p) + w_c \sum_{(p,q)} S(n_p, n_q)\,[u_p \neq u_q] $$

where u ∈ {source, sink} is the label, w_c weights the data cost against the smoothness cost, and the second sum runs over 4-connected neighboring pixel pairs. We experimentally set w_c to 1 in all our experiments. The pixels labeled as source after graph-cut form the initial regions. Since other regions might also have patterns similar to the user-specified region, we remove regions that do not intersect the user-specified strokes. For regions that contain spatial-varying textures, different strokes may lead to different initial regions (see Fig. 6) and affect the subsequent expansion. The user-specified strokes should go across all the different textures in order to achieve good segmentation results.
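
The two-label optimization can be solved with any s-t min-cut library. Below is a minimal sketch using PyMaxflow; the capacity choices in the usage example (a texture similarity map M toward the source, 1 − M toward the sink, and a uniform n-link weight) are illustrative assumptions rather than the paper's exact cost definitions.

```python
import numpy as np
import maxflow  # PyMaxflow

def binary_graphcut(source_cost, sink_cost, pairwise_weight):
    """Solve a two-label (source/sink) segmentation by min-cut.

    source_cost[y, x]     : weight of the t-link to the source
    sink_cost[y, x]       : weight of the t-link to the sink
    pairwise_weight[y, x] : weight of the 4-connected n-links at (y, x)

    Returns a boolean mask, True for pixels ending on the source side
    (i.e., inside the extracted region).
    """
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(source_cost.shape)
    # 4-connected smoothness edges with per-pixel weights.
    g.add_grid_edges(nodes, weights=pairwise_weight, symmetric=True)
    # Terminal edges: a high capacity on the source edge makes that edge
    # expensive to cut, so the pixel tends to end up on the source side.
    g.add_grid_tedges(nodes, source_cost, sink_cost)
    g.maxflow()
    return ~g.get_grid_segments(nodes)  # True = source side

# Tiny synthetic example: a square of high texture similarity.
M = np.zeros((64, 64))
M[16:48, 16:48] = 0.9
inside = binary_graphcut(source_cost=M,
                         sink_cost=1.0 - M,
                         pairwise_weight=np.ones_like(M))  # w_c = 1
```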

    5 Initial region expansion

Fig. 6 Initial region extraction from strokes. Top: several user-specified strokes. Bottom: corresponding extracted initial regions.

Starting from the extracted initial regions, we now show how we expand the regions to their precise boundaries. In short, we first smooth the input image and diffuse the initial regions based on the smoothed image. Then we extract the final texture region with a precise boundary via another graph-cut formulation.

    5.1 Smoothness map construction

To smooth a manga image, we have experimentally determined that the relative total variation method proposed by Xu et al. [2] performs best among the large pool of existing methods. This is mainly because this measure is more tolerant of high-contrast textures than other methods. The method has two input parameters, λ and σ. Here, λ is a weighting factor which we set to 0.015 in all our experiments, and σ is the local window size for measuring local oscillations (textures). In our experiments, we found that σ = 5 works best for most textures, but when the texture is sparser, we may need to assign σ a higher value. Some texture smoothing results for manga images are shown in Figs. 4 and 5(e).

In the smoothed image, if two neighboring pixels have similar intensity values, it is very likely that they are in the same texture region, and vice versa. We also observe that texture regions in manga images are usually enclosed by black boundary lines. Therefore, we can judge whether a pixel is likely to be inside the user-specified region by measuring whether this pixel is smoothly connected to the initial regions. Here, by saying smoothly connected, we mean that there exists a path from this pixel to the initial regions along which the intensity values of the pixels change smoothly. Formally, given the initial regions R and a pixel p outside the initial regions, we define a path from R to p as a sequence of pixels h(R, p) = {q_1, ..., q_L}, where ‖q_{i+1} − q_i‖_1 = 1 for 1 ≤ i < L, q_1 ∈ R, and q_L = p. Here, ‖·‖_1 is the L1-norm operator. We can measure whether p is smoothly connected to R along a path h(R, p) by accumulating the intensity differences along this path:

$$ w\big(h(R,p)\big) = \sum_{i=1}^{L-1} \big| J_{q_{i+1}} - J_{q_i} \big| $$

where J is the smoothed manga image. Since there is more than one path from p to R, we can measure whether p is smoothly connected to R by taking the minimal smoothness value over all possible paths:

$$ F_p = \min_{h(R,p)} w\big(h(R,p)\big) $$

In a practical implementation, we compute the above smoothness values via a diffusion process. More concretely, we first construct a diffusion map F by setting pixels inside the initial regions to 0 and pixels outside the initial regions to +∞. Then we iteratively update the smoothness value of each pixel based on its surrounding pixels using

$$ F_p \leftarrow \min\Big( F_p,\ \min_{q \in \mathcal{N}(p)} \big( F_q + | J_p - J_q | \big) \Big) $$

where N(p) denotes the 4-connected neighbors of p.

We visualize the diffusion process in Fig. 7.
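
A minimal numpy sketch of this diffusion, assuming the smoothed image J is a 2D float array and the initial regions are given as a boolean mask; the fixed iteration cap and the sweep order are implementation conveniences, not details taken from the paper.

```python
import numpy as np

def diffuse_from_region(J, initial_mask, n_iters=500):
    """Diffuse the initial region over the smoothed image J: F is 0 inside
    the initial region and, elsewhere, approximates the minimum accumulated
    intensity difference along any 4-connected path back to the region."""
    J = np.asarray(J, dtype=float)
    F = np.where(initial_mask, 0.0, np.inf)
    for _ in range(n_iters):
        F_old = F.copy()
        # Relax each pixel against its four neighbours (one sweep per direction).
        F[1:, :] = np.minimum(F[1:, :], F[:-1, :] + np.abs(J[1:, :] - J[:-1, :]))
        F[:-1, :] = np.minimum(F[:-1, :], F[1:, :] + np.abs(J[:-1, :] - J[1:, :]))
        F[:, 1:] = np.minimum(F[:, 1:], F[:, :-1] + np.abs(J[:, 1:] - J[:, :-1]))
        F[:, :-1] = np.minimum(F[:, :-1], F[:, 1:] + np.abs(J[:, :-1] - J[:, 1:]))
        F[initial_mask] = 0.0
        if np.allclose(F, F_old):  # stop once the map has converged
            break
    return F

# F = diffuse_from_region(smoothed_image, initial_mask)
```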

    5.2 Final region extraction via graph-cut

While we could extract a final texture region by thresholding the diffusion map, we have found that naive thresholding generally leads to bumpy and leaky boundaries. To avoid these issues, we formulate another graph to extract the final region. As in the previous graph-cut formulation, each pixel p is represented by a non-terminal node n_p, which is connected to two terminal nodes, the source and the sink. If the graph-cut result labels a non-terminal node (pixel) as connected to the source, this pixel is inside the final texture region; otherwise, it is outside. The data cost associated with each edge connecting a terminal node and a non-terminal node measures how likely this pixel is to be smoothly connected to the initial region, based on the smoothness map; it is a function of the diffusion value F_p controlled by a parameter σ_s.

Fig. 7 The diffusion process.

Here, σ_s is empirically set to 0.05 in all our experiments. Intuitively speaking, if a pixel p has a low smoothness value F_p, then D(n_p, source) should be relatively high, and there is a high probability that n_p will be connected to the source. Similarly, for every pair of (4-connected) neighboring pixels p and q in the image, we connect n_p and n_q with an edge. The smoothness cost S(n_p, n_q) associated with each edge connecting two non-terminal nodes measures how likely the two neighboring pixels are to have the same label, based on the intensity difference between p and q in the smoothed image.

Intuitively, if the intensity values of two neighboring pixels are similar in the smoothed image, there is a high probability that they are in the same texture region. Finally, we solve this graph-cut problem by minimizing the following energy function:

$$ E(\{u_p\}) = \sum_{p} D(n_p, u_p) + w_v \sum_{(p,q)} S(n_p, n_q)\,[u_p \neq u_q] $$

where u ∈ {source, sink} is the label, and w_v weights the data cost against the smoothness cost. We empirically set w_v to 0.25 in all our experiments. After graph-cut, the pixels assigned to the source form the final texture region.
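
One plausible instantiation of these costs, assuming Gaussian falloffs with the stated σ_s for the data term and a hypothetical bandwidth σ_J for the smoothness term (σ_J is an assumed parameter, not one reported above), is

$$ D(n_p, \mathrm{source}) = \exp\!\left( -\frac{F_p^2}{2\sigma_s^2} \right), \qquad D(n_p, \mathrm{sink}) = 1 - D(n_p, \mathrm{source}) $$

$$ S(n_p, n_q) = \exp\!\left( -\frac{(J_p - J_q)^2}{2\sigma_J^2} \right) $$

so that pixels smoothly connected to the initial region (small F_p) are strongly tied to the source, and edges across intensity jumps in the smoothed image are cheap to cut.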

    5.3 User control

Given a set of user-specified strokes, our system extracts texture regions quite stably using a set of pre-defined parameters, but we also allow user control. We let the user control the final region by adjusting a single parameter z ∈ [−1, 1]. The smaller z is, the smaller the final texture region will be, and vice versa. We achieve this by incorporating z into the graph-cut formulation; in particular, the data cost is re-defined using a piecewise function P(v, c1, c2) that biases the source and sink costs according to z.

If z is set to −1, all pixels are labeled as sink and the extracted region is empty. If z is set to 1, all pixels are labeled as source and the extracted region is the whole image. The default value of z is 0. We show an example of parameter tuning in Fig. 8. Even though the user can control the extracted region with this parameter, our method is quite stable; in fact, the extracted region is constant for z ∈ [−0.6, 0.6] in this case.
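
A minimal sketch of one way such a control could be wired into the data costs; the linear blend below is a hypothetical stand-in for the paper's piecewise function P(v, c1, c2), chosen only to reproduce the boundary behaviour described above (empty region at z = −1, whole image at z = 1, unchanged costs at z = 0).

```python
import numpy as np

def apply_user_control(source_cost, sink_cost, z):
    """Bias the terminal (data) costs by the user parameter z in [-1, 1].

    z = -1 forces every pixel to the sink (empty region),
    z = +1 forces every pixel to the source (whole image),
    z =  0 leaves the original costs unchanged.
    """
    z = float(np.clip(z, -1.0, 1.0))
    if z >= 0.0:
        # Blend the source cost toward a dominating value, shrink the sink cost.
        hi = sink_cost.max() + 1.0
        new_source = (1.0 - z) * source_cost + z * hi
        new_sink = (1.0 - z) * sink_cost
    else:
        hi = source_cost.max() + 1.0
        new_source = (1.0 + z) * source_cost
        new_sink = (1.0 + z) * sink_cost + (-z) * hi
    return new_source, new_sink

# src, snk = apply_user_control(source_cost, sink_cost, z=0.2)
```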

    6 Results and discussion

    6.1 Validation

To validate the effectiveness of our method, we have applied it to manga images with a variety of texture patterns, including regular patterns (e.g., as in Fig. 9), near-regular patterns (e.g., as in Fig. 10), and irregular patterns (e.g., as in Figs. 5 and 11). We also compare our results with two state-of-the-art methods tailored for manga colorization [6] and manga vectorization [15], respectively.

Figure 9(a) shows a manga image of a cat with a dot pattern. While the feature-based method [6] fails to detect precise boundaries of the texture regions (see Fig. 9(b)), the primitive-based method [15] correctly detects the regions by formulating a specific dot pattern model (see Fig. 9(c)). However, Yao et al.'s method makes very strong assumptions about the primitives in the textures, so it can only handle textures such as dots, stripes, and grids well. In comparison, we make no assumption about the primitives in the textures, yet our method can still achieve results similar to those in Ref. [15] by use of texture feature analysis and smoothness diffusion (see Fig. 9(d)). Figure 10 shows another comparison between the results of Ref. [6] and our method. While both their method and ours analyze the texture of the user-specified stroke, their method only uses a single texture feature to represent the whole stroke. Therefore, their method is incapable of finding texture regions that are spatial-varying (see Fig. 10(b)). In contrast, our method can handle spatial-varying textures well (see Fig. 10(c)).

Fig. 8 User control of the extracted region via a single parameter z.

Fig. 10 ‘Boy’ (1712×907, loading time: 9.72 s, processing time: 2.07 s).

Fig. 11 ‘Cars’ (762×843, loading time: 4.03 s, processing time: 1.38 s).

In Figs. 5 and 11, the user specifies texture regions with large spatial variation, especially in Fig. 11. By using a set of texture feature vectors to represent the user-specified strokes, our method successfully extracts the texture regions with precise boundaries. Since a manga image may contain multiple textures, we also allow the user to specify several sets of strokes indicating different texture regions. The user-specified strokes are processed sequentially so that the user can control the extracted regions more easily. We show three examples in Figs. 12–14, where each input image contains multiple different textures including solid-color regions, regular texture regions, and irregular texture regions. Our method achieves good results in all cases.

    6.2 Timing statistics

Fig. 12 ‘Poker’ (838×1210, loading time: 7.05 s, processing time: 4.72 s).

Fig. 13 ‘Basketball’ (1251×1013, loading time: 7.97 s, processing time: 6.27 s).

Fig. 14 ‘Astonished’ (1251×1013, loading time: 7.10 s, processing time: 5.36 s).

All of our experiments were conducted on a PC with a 2.7 GHz CPU and 64 GB memory; all tests used single-threaded, unoptimized code, and no GPU was used. We break down the computational time for each example into two parts, the loading time and the processing time (given in the caption of each figure). The loading time is the time spent immediately when the user loads an image into the system, and can be regarded as the offline computation time. The processing time is the time the system spends returning the extracted region after the strokes have been drawn, and can be regarded as the online computation time. Whenever the user draws a new stroke or adjusts the control parameter, only the online parts need to be re-executed. We observe that the total computation time depends strongly on the resolution of the input image.

    6.3 Limitations

One of our limitations concerns the latent assumption that the boundaries between two regions are sharp and smooth. Currently, we cannot handle blurred boundaries well. Furthermore, if the boundary of a region is quite spiky (e.g., the shock balloons in Fig. 14(a)), our current graph-cut formulation will result in a smoothed boundary (e.g., the blue regions in Fig. 14(b)). Our method also cannot separate neighboring regions if they are visually inseparable. For example, in Fig. 14(a), the black boundary of the top left shock balloon is connected to the black background of the second panel. Furthermore, the boundary in the smoothed image may deviate by one or two pixels from the original boundary due to limitations of the texture smoothing technique. In this case, we may also fail to extract the precise boundaries of the texture regions.

    7 Conclusions

In this paper, we have proposed a novel system to extract texture regions with precise boundaries. Our method starts from an input image and a set of user-specified strokes, and extracts initial regions containing pixels with similar textures, using Gabor wavelets. However, texture features, such as Gabor wavelets, cannot provide precise boundaries. We further smooth the original image via a texture smoothing technique, and refine the initial regions based on the smoothed image. Our method outperforms existing methods in extracting precise boundaries, especially for spatial-varying textures.

While our method currently assumes hard boundaries, we could adopt matting techniques instead of the current graph-cut formulation to restore regions with alpha values. We also note that the identification of regions depends highly on the semantics of the image content, and introducing perception-based edge extraction techniques could help extract more precise boundaries.

    Acknowledgements

This project was supported by the National Natural Science Foundation of China (Project No. 61272293), and the Research Grants Council of the Hong Kong Special Administrative Region under the RGC General Research Fund (Project Nos. CUHK14200915 and CUHK14217516).

[1] Tuytelaars, T.; Mikolajczyk, K. Local invariant feature detectors: A survey. Foundations and Trends in Computer Graphics and Vision Vol. 3, No. 3, 177–280, 2008.

[2] Xu, L.; Yan, Q.; Xia, Y.; Jia, J. Structure extraction from texture via relative total variation. ACM Transactions on Graphics Vol. 31, No. 6, Article No. 139, 2012.

[3] Julesz, B. Textons, the elements of texture perception, and their interactions. Nature Vol. 290, 91–97, 1981.

[4] Weldon, T. P.; Higgins, W. E.; Dunn, D. F. Efficient Gabor filter design for texture segmentation. Pattern Recognition Vol. 29, No. 12, 2005–2015, 1996.

[5] Varma, M.; Zisserman, A. Texture classification: Are filter banks necessary? In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 2, II-691-8, 2003.

[6] Qu, Y.; Wong, T.-T.; Heng, P.-A. Manga colorization. ACM Transactions on Graphics Vol. 25, No. 3, 1214–1220, 2006.

[7] Hofmann, T.; Puzicha, J.; Buhmann, J. M. Unsupervised texture segmentation in a deterministic annealing framework. IEEE Transactions on Pattern Analysis and Machine Intelligence Vol. 20, No. 8, 803–818, 1998.

[8] Paragios, N.; Deriche, R. Geodesic active regions for supervised texture segmentation. In: Proceedings of the 7th IEEE International Conference on Computer Vision, Vol. 2, 926–932, 1999.

[9] Hays, J.; Leordeanu, M.; Efros, A. A.; Liu, Y. Discovering texture regularity as a higher-order correspondence problem. In: Computer Vision–ECCV 2006. Leonardis, A.; Bischof, H.; Pinz, A. Eds. Springer Berlin Heidelberg, 522–535, 2006.

[10] Liu, Y.; Collins, R. T.; Tsin, Y. A computational model for periodic pattern perception based on frieze and wallpaper groups. IEEE Transactions on Pattern Analysis and Machine Intelligence Vol. 26, No. 3, 354–371, 2004.

[11] Liu, Y.; Lin, W.-C.; Hays, J. Near-regular texture analysis and manipulation. ACM Transactions on Graphics Vol. 23, No. 3, 368–376, 2004.

[12] Liu, Y.; Belkina, T.; Hays, J. H.; Lublinerman, R. Image de-fencing. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 1–8, 2008.

[13] Siddiqui, H.; Boutin, M.; Bouman, C. A. Hardware-friendly descreening. IEEE Transactions on Image Processing Vol. 19, No. 3, 746–757, 2010.

[14] Kopf, J.; Lischinski, D. Digital reconstruction of halftoned color comics. ACM Transactions on Graphics Vol. 31, No. 6, Article No. 140, 2012.

[15] Yao, C.-Y.; Hung, S.-H.; Li, G.-W.; Chen, I.-Y.; Adhitya, R.; Lai, Y.-C. Manga vectorization and manipulation with procedural simple screentone. IEEE Transactions on Visualization and Computer Graphics Vol. 23, No. 2, 1070–1084, 2017.

[16] Aujol, J.-F.; Gilboa, G.; Chan, T.; Osher, S. Structure-texture image decomposition—Modeling, algorithms, and parameter selection. International Journal of Computer Vision Vol. 67, No. 1, 111–136, 2006.

[17] Meyer, Y. Oscillating Patterns in Image Processing and Nonlinear Evolution Equations: The Fifteenth Dean Jacqueline B. Lewis Memorial Lectures. American Mathematical Society, 2001.

[18] Rudin, L. I.; Osher, S.; Fatemi, E. Nonlinear total variation based noise removal algorithms. Physica D: Nonlinear Phenomena Vol. 60, Nos. 1–4, 259–268, 1992.

[19] Yin, W.; Goldfarb, D.; Osher, S. Image cartoon-texture decomposition and feature selection using the total variation regularized L1 functional. In: Variational, Geometric, and Level Set Methods in Computer Vision. Paragios, N.; Faugeras, O.; Chan, T.; Schnörr, C. Eds. Springer Berlin Heidelberg, 73–84, 2005.

[20] Durand, F.; Dorsey, J. Fast bilateral filtering for the display of high-dynamic-range images. ACM Transactions on Graphics Vol. 21, No. 3, 257–266, 2002.

[21] Fattal, R.; Agrawala, M.; Rusinkiewicz, S. Multiscale shape and detail enhancement from multi-light image collections. ACM Transactions on Graphics Vol. 26, No. 3, Article No. 51, 2007.

[22] Paris, S.; Durand, F. A fast approximation of the bilateral filter using a signal processing approach. In: Computer Vision–ECCV 2006. Leonardis, A.; Bischof, H.; Pinz, A. Eds. Springer Berlin Heidelberg, 568–580, 2006.

[23] Kass, M.; Solomon, J. Smoothed local histogram filters. ACM Transactions on Graphics Vol. 29, No. 4, Article No. 100, 2010.

[24] Farbman, Z.; Fattal, R.; Lischinski, D.; Szeliski, R. Edge-preserving decompositions for multi-scale tone and detail manipulation. ACM Transactions on Graphics Vol. 27, No. 3, Article No. 67, 2008.

[25] Subr, K.; Soler, C.; Durand, F. Edge-preserving multiscale image decomposition based on local extrema. ACM Transactions on Graphics Vol. 28, No. 5, Article No. 147, 2009.

[26] Xu, L.; Lu, C.; Xu, Y.; Jia, J. Image smoothing via L0 gradient minimization. ACM Transactions on Graphics Vol. 30, No. 6, Article No. 174, 2011.

[27] Manjunath, B. S.; Ma, W.-Y. Texture features for browsing and retrieval of image data. IEEE Transactions on Pattern Analysis and Machine Intelligence Vol. 18, No. 8, 837–842, 1996.

Chengze Li received his B.S. degree from the University of Science and Technology of China in 2013. He is currently a Ph.D. student in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. His research interests include computer vision, pattern recognition, and high-performance computing.

Tien-Tsin Wong received his B.Sc., M.Phil., and Ph.D. degrees in computer science from the Chinese University of Hong Kong in 1992, 1994, and 1998, respectively. He is currently a professor in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. His main research interests include computer graphics, computational manga, precomputed lighting, image-based rendering, GPU techniques, medical visualization, multimedia compression, and computer vision. He received the IEEE Transactions on Multimedia Prize Paper Award 2005 and the Young Researcher Award 2004.

Open Access  The articles published in this journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Other papers from this open access journal are available free of charge from http://www.springer.com/journal/41095. To submit a manuscript, please go to https://www.editorialmanager.com/cvmj.

Xueting Liu received her B.Eng. degree from Tsinghua University and Ph.D. degree from the Chinese University of Hong Kong in 2009 and 2014, respectively. She is currently a postdoctoral research fellow in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. Her research interests include computer graphics, computer vision, computational manga and anime, and non-photorealistic rendering.

1 The Chinese University of Hong Kong, Hong Kong, China. E-mail: X. Liu, xtliu@cse.cuhk.edu.hk; C. Li, czli@cse.cuhk.edu.hk; T.-T. Wong, ttwong@cse.cuhk.edu.hk (✉).

2 Shenzhen Research Institute, the Chinese University of Hong Kong, Shenzhen, China.

Manuscript received: 2016-09-09; accepted: 2016-12-20
