Multi-scale dictionary for single image super-resolution

Publication Type:
Conference Proceeding
2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2012, pp. 1114–1121
Abstract:
Reconstruction- and example-based super-resolution (SR) methods are promising approaches for restoring a high-resolution (HR) image from low-resolution (LR) input image(s). At large magnification factors, reconstruction-based methods usually fail to hallucinate visual details, while example-based methods sometimes introduce spurious details. Given a generic LR image, to reconstruct a photo-realistic SR image while suppressing artifacts, we propose a novel SR method that combines a multi-scale dictionary with simultaneously integrated local and non-local priors. The local prior suppresses artifacts by using steering kernel regression to predict each target pixel from a small local area. The non-local prior enriches visual details by taking a weighted average over a large neighborhood as an estimate of the target pixel. Essentially, these two priors complement each other. Experimental results demonstrate that the proposed method produces high-quality SR recoveries both quantitatively and perceptually.
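To make the non-local prior concrete, the following is a minimal sketch of a non-local-means-style pixel estimate: the target pixel is predicted as a weighted average over a large search window, where candidate pixels whose surrounding patches resemble the target's patch receive larger weights. The function name, window sizes, and the Gaussian weighting bandwidth `h` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def nonlocal_estimate(img, y, x, patch=3, search=7, h=10.0):
    """Estimate pixel (y, x) as a patch-similarity-weighted average over a
    large search window (non-local-means style; an illustrative stand-in
    for the non-local prior described in the abstract)."""
    r, s = patch // 2, search // 2
    # Reflect-pad so every patch and search offset stays in bounds.
    pad = np.pad(img.astype(np.float64), r + s, mode="reflect")
    yy, xx = y + r + s, x + r + s          # target coords in padded image
    ref = pad[yy - r:yy + r + 1, xx - r:xx + r + 1]
    num, den = 0.0, 0.0
    for dy in range(-s, s + 1):
        for dx in range(-s, s + 1):
            cy, cx = yy + dy, xx + dx
            cand = pad[cy - r:cy + r + 1, cx - r:cx + r + 1]
            d2 = np.mean((ref - cand) ** 2)    # patch dissimilarity
            w = np.exp(-d2 / (h * h))          # similar patches weigh more
            num += w * pad[cy, cx]
            den += w
    return num / den
```

Because every weight is non-negative and they are normalized by `den`, the estimate is always a convex combination of pixels in the search window; this is what lets the prior borrow detail from repeated structures far from the target pixel, in contrast to the local steering-kernel prior, which only looks at a small neighborhood.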