Computational Visual Media, 2016, Issue 1


      Research Article

      High-resolution images based on directional fusion of gradient

Liqiong Wu1, Yepeng Liu1, Brekhna1, Ning Liu1, and Caiming Zhang1 (✉)
© The Author(s) 2016. This article is published with open access at Springerlink.com

Abstract  This paper proposes a novel method for image magnification by exploiting the property that the intensity of an image varies very quickly along the direction of the gradient. It aims to maintain sharp edges and clear details. The proposed method first calculates the gradient of the low-resolution image by fitting a surface with quadratic polynomial precision. Then, bicubic interpolation is used to obtain initial gradients of the high-resolution (HR) image. The initial gradients are readjusted to find the constrained gradients of the HR image, according to spatial correlations between gradients within a local window. To generate an HR image with high precision, a linear surface weighted by the projection length in the gradient direction is constructed. Each pixel in the HR image is determined by the linear surface. Experimental results demonstrate that our method visually improves the quality of the magnified image; in particular, it avoids jagged edges and blurring during magnification.

Keywords  high-resolution (HR); image magnification; directional fusion; gradient direction

1 School of Computer Science and Technology, Shandong University, Jinan 250101, China. E-mail: L. Wu, wuliqiong.june@gmail.com; C. Zhang, czhang@sdu.edu.cn (✉).

Manuscript received: 2015-11-30; accepted: 2015-12-09

      1 Introduction

The aim of image magnification is to estimate the unknown pixel values of a high-resolution (HR) version of an image from groups of pixels in a corresponding low-resolution (LR) image [1]. As a basic operation in image processing, image magnification has great significance for applications in many fields, such as computer vision, computer animation, and medical imaging [2]. With the

rapid development of visualization and virtual reality, image magnification has been widely applied in diverse applications, such as high-definition television, digital media technology, and image processing software. However, image magnification methods face great challenges because of the increased demand for robust technology in challenging applications. In recent years, although many researchers have proposed a variety of methods for image magnification, there is not yet a unified method suitable for all image types. Considering the characteristics of different types of images, it is still hard to achieve low computational time while maintaining edges and detailed texture during magnification. Based on the analysis above, this paper focuses on generating an HR image that maintains the edge sharpness and structural details of a single LR image by means of the directional fusion of image gradients.

1.1 Traditional methods

Traditional methods, including nearest neighbor, bilinear [3], bicubic [4, 5], and Lanczos resampling [6], are widely applied in a variety of commercial software and business applications for image processing. The main advantages of such conventional methods are that they are easy to understand, simple to implement, and fast to compute. However, these methods have limitations. Using a single unified mathematical model causes loss of high frequency information at edges. Thus, conventional methods are likely to introduce jagged edges and blurred details at significant transitions in an image, such as edges and texture details.
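For reference, this kind of conventional resampling is available in common libraries. A minimal sketch using SciPy's order-3 (cubic spline) zoom, which is closely related to the bicubic convolution of Ref. [4], on an assumed 8-bit grayscale array (the function name and example data are ours):

    import numpy as np
    from scipy import ndimage

    def bicubic_upscale(lr, factor=2):
        """Enlarge a grayscale image with order-3 (cubic spline) interpolation."""
        hr = ndimage.zoom(lr.astype(np.float64), factor, order=3)
        return np.clip(hr, 0, 255).astype(np.uint8)

    # Example: magnify a 256 x 256 image to 512 x 512.
    lr = (np.random.rand(256, 256) * 255).astype(np.uint8)
    print(bicubic_upscale(lr).shape)  # (512, 512)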

1.2 Advanced methods

Studies have shown that human eyes are more sensitive to the edges of an image, which transmit most of the information in the image, so images with good-quality edges help to clearly describe boundaries and the outlines of objects. Edges that contain important information are therefore of great significance in image magnification. Various edge-directed methods have been proposed in recent years, most of which take advantage of edge information to overcome the shortcomings of conventional methods, e.g., Refs. [7-13].

The edge-guided interpolation method put forward by Li and Orchard [10] is based on image covariance: it exploits local covariance coefficients estimated from the pixel values of the LR image to calculate the covariance coefficients of the HR image, utilizing the geometric duality between LR and HR images. These covariance coefficients are then used to perform interpolation. Zhang and Wu [12] present a nonlinear interpolation method based on estimating a missing pixel in two mutually orthogonal directions, and use a minimum mean square error estimation technique to fuse the two estimates.

Zhang et al. [8] propose a method based on a combination of quadratic polynomials to construct a reverse fitting surface for a given image, in which the edges of the image act as a constraint; this ensures the fitted surface has better approximation accuracy. Fan et al. [14] present a robust and efficient high-resolution detail-preserving algorithm based on a least-squares formulation. A gradient-guided image interpolation method is presented in Ref. [9], assuming that the variation in pixel values is constant along the edge. The method is simple to implement and has good edge retention, but it leads to a wide edge transition zone because of the diffusion of the HR image gradients, so it is not suitable for magnifying images with complicated textures and detail.

Corresponding patches between low- and high-resolution images from a database can also be used, with machine learning-based techniques or sampling methods, to achieve interpolation [15-20].

Traditional methods often introduce artifacts such as jagged edges and blurred details during magnification. Edge-based methods tend to generate artifacts in small-scale edge structures and complicated texture details. Learning-based techniques are complex and time-consuming, with the outcome influenced by the training data. Because of these issues, this paper proposes a novel method to produce an HR image based on the directional fusion of gradients.

      2 Related work

In this study, we use a degradation model that assumes the LR image is directly down-sampled from the HR image, rather than obtained by Gaussian smoothing. Since the proposed method is partly based on CSF [8] and GGI [9], this section briefly introduces both methods.

2.1 Quadratic surface fitting constrained by edges (CSF)

In CSF, the image data is assumed to be sampled from an original scene that can be approximated by piecewise polynomials [8]. The fitted surface is constructed by reversing the image sampling process, using the edge information as a constraint. This makes the surface a good approximation to the original scene, with quadratic polynomial precision. Assume that P_{i,j} is an image of size N × N sampled from the original scene F(x, y) on a unit square, so
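The sampling relation itself (Eq. (1)) did not survive extraction; a plausible form, which is our assumption rather than the authors' exact formula, is

    P_{i,j} = \int_{j-0.5}^{j+0.5} \int_{i-0.5}^{i+0.5} w(x, y)\, F(x, y)\, \mathrm{d}x\, \mathrm{d}y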

where w(x, y) is a weight function, set to 1.

In the region [i-1.5, i+1.5] × [j-1.5, j+1.5], let u = x - i and v = y - j (see Fig. 1). The fitted surface f_{i,j}(x, y) approximating F(x, y) is defined as
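The surface equation (Eq. (2)) was lost in extraction; a plausible quadratic form, consistent with the gradient relations g_x = a_4 and g_y = a_5 used in Section 3.1, is

    f_{i,j}(x, y) = a_1 u^2 + a_2 u v + a_3 v^2 + a_4 u + a_5 v + a_6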

where a_1, a_2, a_3, a_4, a_5, and a_6 are to be determined. The unknown coefficients are determined by a least-squares method constrained by edge information [8]. Since a good quality surface helps to produce high precision interpolation, we will later make use of the constructed surface to interpolate gradients.

Fig. 1 Constructing the surface.

2.2 Gradient-guided interpolation (GGI)

In order to eliminate jagged edges, a gradient-guided interpolation method is proposed in Ref. [9], based on the idea that the variation in pixel values is constant along the edge direction. GGI uses a Sobel kernel to calculate the gradients of the LR image, adopts bicubic interpolation to determine the gradients of the HR image, and then applies gradient diffusion. Finally, the unknown HR pixels P_{i,j} to be interpolated are divided into three categories according to the LR pixels P_{x,y} in the neighborhood N_{ij}.

P_{i,j} is estimated by summing the neighborhood pixels N_{ij} weighted by w_{xy}, where a shorter distance carries a greater weight. Let d_{xy} denote the distance between P_{x,y} and P_{i,j} projected along the gradient direction of P_{i,j}. The weight decays exponentially with d_{xy}, where a = 0.2 controls the rate of exponential decrease and S is a normalizing factor summed over the neighborhood.
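Eqs. (4) and (5) did not survive extraction; a plausible reconstruction of the weight and its normalizer, consistent with the description above but not guaranteed to match Ref. [9] exactly, is

    w_{xy} = \frac{1}{S} e^{-a d_{xy}}, \qquad S = \sum_{(x,y) \in N_{ij}} e^{-a d_{xy}}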

Although the method of Ref. [9] provides good quality interpolation at edges by significantly decreasing jagged edges, it can cause loss of detail in non-edge regions in some cases. In particular, it is unsuitable for image areas containing complex details and abundant texture.

      3 High-resolution image based on directional fusion of gradient

In this section, a new magnification method is put forward based on fusion in the gradient direction, which exploits the property that the pixel values change very quickly in the gradient direction. From the analysis above, maintaining the sharpness of edges and the clarity of detailed textures is the key task in image magnification, since most information in the image is transmitted by edges and detail textures. Our method first finds approximate gradients of the LR image, then calculates those of the HR image. We then estimate the gray values of the unknown pixels in the HR image using a linear approximation of the neighboring pixels. For simplicity of discussion, we mainly focus on enlargement by a factor of 2, producing an HR image of size 2m × 2n from an LR image of size m × n. The general information flow of the proposed method is shown in Fig. 2.

3.1 Calculating the gradients of the HR image

In order to compute the LR gradients with high accuracy, our method adopts Eq. (2) to compute the LR gradient for each P_{i,j}. The gradient vector of the LR image is defined as (g_x, g_y), where g_x and g_y are the partial derivatives of the fitted surface f_{i,j} with respect to x and y. Thus, for each P_{i,j} we obtain the LR gradients g_x = a_4 and g_y = a_5. The LR gradients are then used to calculate the HR gradients, denoted (G_X, G_Y), by bicubically interpolating the LR gradients.
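A minimal sketch of this step, assuming the per-pixel coefficients a_4 and a_5 from the surface fitting are already stored in two arrays gx and gy (the function name and array names are ours, not the paper's):

    import numpy as np
    from scipy import ndimage

    def initial_hr_gradients(gx, gy, factor=2):
        """Bicubically (order-3) interpolate the LR gradient fields gx = a4
        and gy = a5 to obtain the initial HR gradients (GX, GY)."""
        GX = ndimage.zoom(np.asarray(gx, dtype=float), factor, order=3)
        GY = ndimage.zoom(np.asarray(gy, dtype=float), factor, order=3)
        return GX, GY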

Fig. 2 Flowchart of the method.

3.2 Diffusing the gradients of the HR image

The GGI method [9] utilizes the gradient information in order to maintain the sharpness of edges. However, the spatial distribution of gradients is not considered effectively during diffusion: the norm of the gradient takes a local maximum in the gradient direction [21]. Directly replacing the gradient at a central pixel by the mean over some region may change the gradient direction in an inappropriate way in detail-rich portions, which may result in distortion of details.

Therefore, we take account of the spatial correlation between the gradient directions to improve the diffusion of the gradients. Diffusion deals with the gradient values in the vertical (G_X) and horizontal (G_Y) directions. A local window of size 5 × 5 with P_{i,j} as the central pixel (see Fig. 3) is used to adjust the gradient direction. Our method adjusts the gradient vector of the center pixel using the average value of those gradients whose direction falls within a certain range relative to that of the central pixel.

By considering the spatial correlations between gradient directions, our method can approximate HR gradients that not only maintain the sharpness of edges, but also better retain the structure of textures and details. Let k denote the number of pixels satisfying the condition β_{xy} < α, where α = 45°.
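A minimal sketch of our reading of this adjustment (the array names, the border handling, and the inclusion of the central gradient in the average are our assumptions, not details given in the paper):

    import numpy as np

    def diffuse_gradients(GX, GY, alpha_deg=45.0, radius=2):
        """Adjust each HR gradient using the mean of the gradients in a
        5x5 window whose direction is within alpha of the central one."""
        H, W = GX.shape
        GX_new, GY_new = GX.copy(), GY.copy()
        cos_alpha = np.cos(np.deg2rad(alpha_deg))
        for i in range(H):
            for j in range(W):
                gc = np.array([GX[i, j], GY[i, j]])
                nc = np.linalg.norm(gc)
                if nc == 0:
                    continue  # no reliable direction at this pixel
                sx = sy = 0.0
                k = 0  # number of pixels with beta_xy < alpha
                for di in range(-radius, radius + 1):
                    for dj in range(-radius, radius + 1):
                        x, y = i + di, j + dj
                        if not (0 <= x < H and 0 <= y < W):
                            continue
                        g = np.array([GX[x, y], GY[x, y]])
                        n = np.linalg.norm(g)
                        # beta_xy < alpha  <=>  cos(beta_xy) > cos(alpha)
                        if n > 0 and np.dot(gc, g) / (nc * n) > cos_alpha:
                            sx += g[0]
                            sy += g[1]
                            k += 1
                if k > 0:
                    GX_new[i, j] = sx / k
                    GY_new[i, j] = sy / k
        return GX_new, GY_new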

After diffusion, we obtain the adjusted HR gradients, which are used to calculate the gray values of the HR pixels.

3.3 Estimation of the HR image

In this section, we give the strategy for calculating the unknown pixels of the HR image. As noted in Section 2.2, the GGI method [9] offers only the precision of a constant. In comparison with GGI, our method provides higher, linear polynomial interpolation precision by constructing a linear surface to approximate the intensity of the HR image, and so performs well in maintaining the details of the image. Depending on the known pixels in the neighborhood window centered on the unknown pixel (see Fig. 4(b)), the unknown pixels of the HR image may be divided into three categories:

(1) Black: I_H(2n-1, 2m-1);

(2) Blue: I_H(2n, 2m);

(3) Pink: I_H(2n-1, 2m) and I_H(2n, 2m-1), where n = 1, ..., N and m = 1, ..., M.

Therefore, the estimation of the unknown pixels in the HR image is achieved in three steps.

      Step 1:

In this step, we assign the values of the LR pixels to the corresponding HR pixels. For an LR image I_L of size N × M enlarged to give an HR image of size 2N × 2M, we have I_H(2n-1, 2m-1) = I_L(n, m), where n = 1, ..., N and m = 1, ..., M. I_L(n, m) and I_H(2n-1, 2m-1) are the solid black dots shown in Fig. 4(a) and Fig. 4(b), respectively.

      Step 2:

In this step, we use the four neighboring black pixels to calculate the central pixels P_{i,j} (the blue dots in Fig. 5(a)) satisfying P_{i,j} ∈ I_H(2n, 2m). In order to obtain P_{i,j} precisely, we construct a linear surface to approximate the image data via directional fusion of gradients. Within the neighborhood window N_{ij} centered on P_{i,j}, our method constructs a linear surface f^H_{i,j} using a linear polynomial as follows:
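The linear polynomial itself (Eq. (8)) was lost in extraction; given the coefficients a, b, and c named below and the result P_{i,j} = c at the window center, it presumably takes the form

    f^{H}_{i,j}(u, v) = a u + b v + c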

Fig. 3 Diffusion of gradients. The blue dots P_{x,y} stand for neighboring HR pixels of P_{i,j}; the blue arrow represents the gradient direction at P_{x,y}, while the red arrow indicates the gradient direction at P_{i,j}. β_{xy} is the angle between the gradient directions at P_{i,j} and P_{x,y}. The dashed area defines the range of angles for which the gradient direction of P_{x,y} is positively correlated with that of P_{i,j}.

Fig. 4 Degradation model. (a) Pixels of the LR image. (b) Pixels of the HR image. The solid black dots in (a) represent pixels of the LR image. The dots in (b) are pixels of the HR image: the black dots are the known HR pixels I_H(2n-1, 2m-1), which are directly determined by the corresponding LR image pixels; the blue dots stand for the case I_H(2n, 2m); and the pink dots represent the cases I_H(2n-1, 2m) and I_H(2n, 2m-1).

Fig. 5 Three cases for constructing the linear surface f^H_{i,j}. (a) represents the linear surface constructed in Step 2; (b) and (c) represent the linear surfaces constructed in Step 3. Black dots are known pixels of the HR image obtained from the corresponding LR pixels, and blue dots are unknown HR pixels calculated in Step 2.

where a, b, and c are unknown coefficients to be found.

We determine the unknown coefficients (i.e., a, b, c) in Eq. (8) by a least-squares method, weighted by the gradients and the values of the pixels in the neighborhood window.
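Eq. (9) did not survive extraction; a plausible weighted least-squares objective consistent with this description is

    E = \sum_{(x,y) \in N_{ij}} w_{xy} \left[ f^{H}_{i,j}(x, y) - P_{x,y} \right]^2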

where N_{ij} represents the neighboring pixels P_{x,y} of the central pixel P_{i,j}, satisfying (x, y) ∈ {(-1, 1), (1, 1), (-1, -1), (1, -1)}. The procedure to calculate w_{xy} is given in Eq. (4) (see Fig. 6(a)).

Minimizing Eq. (9) requires setting its partial derivatives with respect to a, b, and c to zero, which yields a linear system for the three coefficients.

Substituting the solved coefficients (a, b, c) into Eq. (8) gives the approximate pixel value, i.e., P_{i,j} = c.
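A sketch of the Step 2 solve for a single pixel, under the reconstructed objective above; the helper name, the example values, and the weights are ours, not the paper's:

    import numpy as np

    def estimate_center_pixel(offsets, values, weights):
        """Fit f(u, v) = a*u + b*v + c to the weighted neighbor samples and
        return the value at the window center, i.e., P_ij = c."""
        A = np.array([[u, v, 1.0] for (u, v) in offsets])
        b = np.asarray(values, dtype=float)
        sw = np.sqrt(np.asarray(weights, dtype=float))
        # Weighted least squares via row scaling by sqrt(w_xy).
        coeff, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
        a_coef, b_coef, c = coeff
        return c

    # Example with the Step 2 (diagonal) neighborhood and made-up weights.
    print(estimate_center_pixel([(-1, 1), (1, 1), (-1, -1), (1, -1)],
                                [100.0, 110.0, 90.0, 105.0],
                                [0.3, 0.2, 0.3, 0.2]))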

Fig. 6 Weighting. (a) shows the case solved in Step 2; (b) and (c) are the two situations determined in Step 3 using the results of Step 2. The black dots are known pixels of the HR image, and the blue dots are unknown HR pixels. P_{x,y} stands for the neighboring pixels of P_{i,j}; the arrow indicates the gradient direction at the center pixel P_{i,j}.

      Step 3:

In this step, we use the results of Step 1 and Step 2 to estimate the remaining unknown HR pixels (the pink dots in Fig. 4(b), i.e., P_{i,j} ∈ {I_H(2n-1, 2m), I_H(2n, 2m-1)}). The gray value of the central pixel P_{i,j} is calculated using the same procedure as in Step 2. We use Eq. (8) to construct a linear surface (see Figs. 5(b) and 5(c)). The surface is constrained by Eq. (9) in order to obtain an approximating surface, where (x, y) ∈ {(-1, 0), (0, 1), (1, 0), (0, -1)}. The weight w_{xy} is calculated from Eq. (4) (see Figs. 6(b) and 6(c)).

Finally, the pixels located on the image boundary are calculated by averaging the existing neighboring pixels, instead of by constructing a surface.

      4 Results and discussion

In order to verify the effectiveness of the proposed method, we have carried out many experiments with different kinds of images, including natural images, medical images, and synthetic images. The results demonstrate that the proposed method can obtain better quality image magnification, especially at edges and in detail-rich areas. To demonstrate the advantages of our proposed method, we compare magnification results with several methods, including bicubic interpolation (Bicubic) [4], cubic surface fitting with edges as constraints (CSF) [8], the new edge-directed interpolation method (NEDI) [10], and gradient-guided interpolation (GGI) [9]. We now analyze the experimental results in detail.

In the experiments, we carried out tests with different types of images by magnifying LR images of size 256 × 256 to get HR images of size 512 × 512. Figures 7 and 8 show the magnified images with labeled local windows containing edges and details extracted from the HR image. Comparing the corresponding regions of the boat image in Fig. 7, we can see that our method is more capable of dealing with edge portions of an image, while other methods introduce jagged edges or blurring artifacts near edges. It is also clear from Fig. 8 that the Bicubic [4] and CSF [8] methods tend to introduce blurring artifacts: see the moustache of the baboon. NEDI [10] produces zigzags that are particularly evident, while GGI [9] causes loss of detail in the area of the moustache. Our method leads to better visual quality than the other methods.

Fig. 7 Results of magnifying the boat image: (a) ground truth; (b) Bicubic; (c) CSF; (d) NEDI; (e) GGI; (f) ours.

Fig. 8 Results of magnifying the baboon image: (a) ground truth; (b) Bicubic; (c) CSF; (d) NEDI; (e) GGI; (f) ours.

We also conducted experiments with MRI images of a brain, segmented into four classes by the MICO (multiplicative intrinsic component optimization) segmentation algorithm [22]. Although the MICO algorithm provides high accuracy segmentation, there are still rough edges due to limitations of the segmentation method. Figures 9(a)-9(f) show Bicubic, CSF, NEDI, GGI, and our results from top to bottom. The magnification results shown in Fig. 9 illustrate that our method can deal well with a segmented image with severe zigzags, effectively retaining sharp edges while avoiding jagged artifacts during magnification.

For synthetic images, Fig. 10, Fig. 11, and Fig. 12 show the maps of gray values at edge portions after applying the several methods mentioned above. It is clear that our method is able to maintain the sharpest edges with less blur: other methods produce fuzzy data around the edges, which results in blurring artifacts.

In order to evaluate the quality of the magnification results, we use three objective methods based on comparisons with explicit numerical criteria [23]: peak signal to noise ratio (PSNR), structural similarity (SSIM), and percentage edge error (PEE). PSNR measures the disparity between the magnified image and the ground truth image, and is defined as
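The PSNR formula was lost in extraction; the standard definition for 8-bit images is

    \mathrm{PSNR} = 10 \log_{10} \frac{255^2}{\mathrm{MSE}}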

where the mean square error (MSE) between the two images is
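given by the standard definition, with S the ground truth image and I the magnified image, both of size M × N:

    \mathrm{MSE} = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \left[ S(i, j) - I(i, j) \right]^2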

SSIM measures the similarity of the structural information between the magnified image and the ground truth image [24]. It is related to quality as perceived by the human visual system (HVS), and is given by
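The SSIM formula was lost in extraction; in the standard formulation of Ref. [24], with small constants C_1 and C_2 for numerical stability, it is

    \mathrm{SSIM} = \frac{(2\mu_S \mu_I + C_1)(2\sigma_{SI} + C_2)}{(\mu_S^2 + \mu_I^2 + C_1)(\sigma_S^2 + \sigma_I^2 + C_2)}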

where μ_S and μ_I denote the mean values of the ground truth image and the magnified image respectively, σ_S^2 and σ_I^2 represent the variances of the corresponding images, and σ_SI denotes the covariance of the two images.

For the images shown in Fig. 13, the values of PSNR and SSIM are listed in Table 1 and Table 2, respectively. It is clear that our proposed method performs well in most cases, giving the highest values of PSNR and SSIM.

In addition, the percentage edge error (PEE) [25] was also used to measure perceptual errors. PEE is well suited to assessing image magnification in which the major artifact is blurring. PEE measures the closeness of details in the interpolated image to those in the ground truth image. Generally, in image interpolation, a positive value of PEE means that the magnified image is over-smoothed, with likely loss of details. Thus, a method with smaller PEE is better at avoiding blurring artifacts. PEE is defined by
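The PEE formula was lost in extraction; a plausible reconstruction, expressing the relative loss of edge strength as a percentage in the spirit of Ref. [25], is

    \mathrm{PEE} = \frac{\mathrm{ES}_S - \mathrm{ES}_I}{\mathrm{ES}_S} \times 100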

where ES_S denotes the edge strength of the ground truth image and ES_I is that of the magnified image. ES is defined as
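The definition of ES was also lost in extraction; a plausible form, summing the edge intensity over all pixels, is

    \mathrm{ES} = \sum_{i} \sum_{j} \mathrm{EI}(i, j)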

where EI(i, j) denotes the edge intensity value of the image.

The PEE values for each interpolation method are shown in Table 3. It is clear that the PEE value for the proposed method is very low compared with the values for the other techniques, so structural edges are better preserved and less blurring is produced by our method.

The analysis of the experimental results above shows that the proposed method achieves a good balance between edge preservation and blurring, performing especially well on synthetic images and segmented medical images. The major drawback of the method lies in using gradients only in the horizontal and vertical directions, which makes it hard to obtain accurate gradient values for images with very low contrast. Our future work will consider how to calculate gradients in more directions, and how to use a surface of higher accuracy to approximate the image data. We hope to develop a magnification method that can maintain edges and detailed texture perfectly with low computational time.

      5 Conclusions

This paper presents a novel method of producing an HR image by making use of gradient information. It maintains the sharpness of edges and clear details in an image. Our proposed method first obtains the LR image gradient values by fitting a surface with quadratic polynomial precision, then adopts a bicubic method to get initial values of the HR image gradients. It then adjusts the gradients according to the spatial correlation in the gradient direction to constrain the gradients of the HR image. Finally, it estimates the missing pixels using a linear surface weighted by neighboring LR pixels. Experimental results demonstrate that our proposed method achieves good quality image enlargement, avoiding the jagged artifacts that arise from direct interpolation and preserving sharp edges by gradient fusion.

Fig. 9 Enlarged images of a brain. (A) and (B) are segmented brain images produced by MICO. Images (a), (b), and (c) are the results of enlarging a specified area of (A). Images (d), (e), and (f) are the results of enlarging a specified area of (B).

Fig. 10 Magnification of vertical edges: (a) original image and gray values; (b) ours; (c) Bicubic; (d) CSF; (e) NEDI; (f) GGI.

      Acknowledgements

The authors would like to thank the anonymous reviewers for their valuable suggestions that greatly improved the paper. This project was supported by the National Natural Science Foundation of China (Nos. 61332015, 61373078, 61572292, and 61272430), and the National Research Foundation for the Doctoral Program of Higher Education of China (No. 20110131130004).

      References

[1] Siu, W.-C.; Hung, K.-W. Review of image interpolation and super-resolution. In: Proceedings of the Asia-Pacific Signal & Information Processing Association Annual Summit and Conference, 1-10, 2012.

[2] Gonzalez, R. C.; Woods, R. E. Digital Image Processing, 3rd edn. Upper Saddle River, NJ, USA: Prentice-Hall, Inc., 2006.

Fig. 11 Magnification of horizontal edges: (a) original image and gray values; (b) ours; (c) Bicubic; (d) CSF; (e) NEDI; (f) GGI.

[3] Franke, R. Scattered data interpolation: Tests of some methods. Mathematics of Computation Vol. 38, No. 157, 181-200, 1982.

[4] Keys, R. G. Cubic convolution interpolation for digital image processing. IEEE Transactions on Acoustics, Speech and Signal Processing Vol. 29, No. 6, 1153-1160, 1981.

[5] Park, S. K.; Schowengerdt, R. A. Image reconstruction by parametric cubic convolution. Computer Vision, Graphics, and Image Processing Vol. 23, No. 3, 258-272, 1983.

[6] Duchon, C. E. Lanczos filtering in one and two dimensions. Journal of Applied Meteorology Vol. 18, No. 8, 1016-1022, 1979.

[7] Allebach, J.; Wong, P. W. Edge-directed interpolation. In: Proceedings of the International Conference on Image Processing, Vol. 3, 707-710, 1996.

[8] Zhang, C.; Zhang, X.; Li, X.; Cheng, F. Cubic surface fitting to image with edges as constraints. In: Proceedings of the 20th IEEE International Conference on Image Processing, 1046-1050, 2013.

[9] Jing, G.; Choi, Y.-K.; Wang, J.; Wang, W. Gradient guided image interpolation. In: Proceedings of the IEEE International Conference on Image Processing, 1822-1826, 2014.

[10] Li, X.; Orchard, M. T. New edge-directed interpolation. IEEE Transactions on Image Processing Vol. 10, No. 10, 1521-1527, 2001.

[11] Tam, W.-S.; Kok, C.-W.; Siu, W.-C. Modified edge-directed interpolation for images. Journal of Electronic Imaging Vol. 19, No. 1, 013011, 2010.

[12] Zhang, D.; Wu, X. An edge-guided image interpolation algorithm via directional filtering and data fusion. IEEE Transactions on Image Processing Vol. 15, No. 8, 2226-2238, 2006.

[13] Zhang, L.; Zhang, C.; Zhou, Y.; Li, X. Surface interpolation to image with edge preserving. In: Proceedings of the 22nd International Conference on Pattern Recognition, 1055-1060, 2014.

[14] Fan, H.; Peng, Q.; Yu, Y. A robust high-resolution details preserving denoising algorithm for meshes. Science China Information Sciences Vol. 56, No. 9, 1-12, 2013.

[15] Chang, H.; Yeung, D.-Y.; Xiong, Y. Super-resolution through neighbor embedding. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 1, I, 2004.

[16] Dong, W.; Zhang, L.; Lukac, R.; Shi, G. Sparse representation based image interpolation with nonlocal autoregressive modeling. IEEE Transactions on Image Processing Vol. 22, No. 4, 1382-1394, 2013.

[17] Freeman, W. T.; Jones, T. R.; Pasztor, E. C. Example-based super-resolution. IEEE Computer Graphics and Applications Vol. 22, No. 2, 56-65, 2002.

Fig. 12 Magnification of diagonal edges: (a) original image and gray values; (b) ours; (c) Bicubic; (d) CSF; (e) NEDI; (f) GGI.

Fig. 13 Test images. Top row, left to right: cameraman, baboon, boat, goldhill, lake. Bottom row: peppers, couple, Lena, crowd, medical.

      Table 1 Values of PSNR

      Table 2 Values of SSIM

[21] Ohtake, Y.; Suzuki, H. Edge detection based multi-material interface extraction on industrial CT volumes. Science China Information Sciences Vol. 56, No. 9, 1-9, 2013.

[22] Li, C.; Gore, J. C.; Davatzikos, C. Multiplicative intrinsic component optimization (MICO) for MRI bias field estimation and tissue segmentation. Magnetic Resonance Imaging Vol. 32, No. 7, 913-923, 2014.

[23] Hore, A.; Ziou, D. Image quality metrics: PSNR vs. SSIM. In: Proceedings of the 20th International Conference on Pattern Recognition, 2366-2369, 2010.

[24] Wang, Z.; Bovik, A. C.; Sheikh, H. R.; Simoncelli, E. P. Image quality assessment: From error visibility to structural similarity. IEEE Transactions on Image Processing Vol. 13, No. 4, 600-612, 2004.

[25] Al-Fohoum, A. S.; Reza, A. M. Combined edge crispiness and statistical differencing for deblocking JPEG compressed images. IEEE Transactions on Image Processing Vol. 10, No. 9, 1288-1298, 2001.

Liqiong Wu received her B.S. degree in computer science and technology from Shandong University, Jinan, China, in 2014. Currently, she is a master student in the School of Computer Science and Technology, Shandong University, Jinan, China. Her research interests include computer graphics and image processing.

      Table 3 Values of PEE as percentages

[18] Sun, J.; Sun, J.; Xu, Z.; Shum, H.-Y. Image super-resolution using gradient profile prior. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1-8, 2008.

[19] Wu, W.; Liu, Z.; He, X. Learning-based super resolution using kernel partial least squares. Image and Vision Computing Vol. 29, No. 6, 394-406, 2011.

[20] Yang, J.; Wright, J.; Huang, T. S.; Ma, Y. Image super-resolution via sparse representation. IEEE Transactions on Image Processing Vol. 19, No. 11, 2861-2873, 2010.

Yepeng Liu received his B.S. degree in computer science and technology from Shandong University, Jinan, China, in 2014. He is currently pursuing a Ph.D. degree in the School of Computer Science and Technology, Shandong University, Jinan, China. His research interests include computer graphics, image processing, and geometry processing.

Brekhna received her B.S. degree in computer science and technology from the University of Peshawar, Pakistan, in 2010. She received her M.S. degree in computer science and technology from the Comsats Institute of Technology, Islamabad, Pakistan, in 2013. Currently, she is a Ph.D. candidate in the School of Computer Science and Technology, Shandong University, Jinan, China. Her research interests include image processing, computer graphics, and machine learning.

Ning Liu received her B.S. degree in library science from Wuhan University. Since 2002, she has been an associate research librarian in the School of Computer Science and Technology, Shandong University, Jinan, China.

Caiming Zhang is a professor and doctoral supervisor in the School of Computer Science and Technology at Shandong University. He received his B.S. and master degrees in computer science from Shandong University in 1982 and 1984, respectively, and his Ph.D. degree in computer science from the Tokyo Institute of Technology, Japan, in 1994. From 1997 to 2000, Dr. Zhang held a visiting position at the University of Kentucky, USA. His research interests include CAGD, CG, information visualization, and medical image processing.

Open Access  The articles published in this journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Other papers from this open access journal are available free of charge from http://www.springer.com/journal/41095. To submit a manuscript, please go to https://www.editorialmanager.com/cvmj.
