CodePudding user response:
http://www.pudn.com/downloads691/sourcecode/windows/control/detail2784274.html

CodePudding user response:
Over the past decade, many new image analysis and processing methods have been proposed. Some are fully automatic, while others need manual interaction; typical examples are stereo depth computation, image colorization, tone mapping of high-dynamic-range (HDR) images, and graph cuts. These algorithms give good results, but they share one problem: the amount of computation is extremely large, which makes it hard to meet users' needs, and the size of digital images has been growing at a remarkable rate over the same period. A further problem is that some of these algorithms require solving a large sparse system of equations, which may be so big that the system cannot allocate enough memory for the process. An intuitive and simple idea for dealing with both problems is: first downsample the original image, run the processing on the downsampled image, and then upsample the result back to the original size. The catch is that the upsampling algorithm directly affects the quality of the final result: plain nearest-neighbour, bilinear, or bicubic interpolation makes the output look distorted. In this situation, however, we actually hold more information than in a simple image magnification, because the original full-resolution image is still available. Can we use that information to improve the upsampled result? A rough sketch of the pipeline is given below.
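As a minimal sketch of the downsample-process-upsample idea (the `process` callback, the scale factor, and the nearest-neighbour upsampling step are all placeholders of mine, not from any particular paper):

```python
import numpy as np

def process_at_low_res(img, process, scale=4):
    """Run an expensive `process` on a downsampled copy of `img`,
    then upsample its result back to the original size.
    The upsampling here is plain nearest-neighbour replication,
    i.e. the naive step that causes the distortion discussed above."""
    small = img[::scale, ::scale]          # crude decimation
    result_small = process(small)          # the expensive step, now much cheaper
    # nearest-neighbour upsampling back to the original grid
    big = np.repeat(np.repeat(result_small, scale, axis=0), scale, axis=1)
    return big[:img.shape[0], :img.shape[1]]
```

Replacing that last, naive upsampling step with something smarter is exactly where the algorithms below come in.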
So far I have seen two kinds of algorithms in this area. One is joint bilateral filtering:
http://www.chawenti.com/articles/23179.html
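Here is a minimal sketch of the joint bilateral upsampling idea, assuming grayscale float images; the function and parameter names (`sigma_s`, `sigma_r`, `radius`) are my own, not from the linked page. Each high-resolution output pixel is a weighted average of nearby low-resolution samples, where the weight combines a spatial Gaussian on the low-res grid with a range Gaussian measured on the full-resolution guide image, so edges in the guide steer the interpolation:

```python
import numpy as np

def joint_bilateral_upsample(low_res, guide, radius=2, sigma_s=1.0, sigma_r=0.1):
    """Upsample `low_res` (h x w floats) to the size of `guide`
    (H x W floats in [0, 1]).  Weights combine a spatial Gaussian on
    the low-res grid with a range Gaussian on the high-res guide, so
    edges in the guide image are preserved in the upsampled result."""
    H, W = guide.shape
    h, w = low_res.shape
    sy, sx = H / h, W / w                      # scale between the two grids
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            ly, lx = y / sy, x / sx            # this pixel on the low-res grid
            cy = min(int(round(ly)), h - 1)
            cx = min(int(round(lx)), w - 1)
            acc = norm = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    qy, qx = cy + dy, cx + dx
                    if not (0 <= qy < h and 0 <= qx < w):
                        continue
                    # spatial weight: distance measured on the low-res grid
                    fs = np.exp(-((ly - qy) ** 2 + (lx - qx) ** 2)
                                / (2 * sigma_s ** 2))
                    # range weight: intensity difference in the full-res guide
                    gy = min(int(round(qy * sy)), H - 1)
                    gx = min(int(round(qx * sx)), W - 1)
                    gr = np.exp(-(guide[y, x] - guide[gy, gx]) ** 2
                                / (2 * sigma_r ** 2))
                    acc += fs * gr * low_res[qy, qx]
                    norm += fs * gr
            out[y, x] = acc / norm
    return out
```

The double loop keeps the idea explicit but is very slow in pure Python; a practical implementation would vectorise it or move it to C/C++.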