We introduce a simple principle of consistency for image denoising: the relative similarities between pixels, as encoded by an averaging matrix, should be preserved in the filtered output. This principle yields a consistency filter with theoretically desirable properties: it is symmetric, stable, and belongs to the family of generalized doubly stochastic matrices. Our consistency filter also admits an interpretation as a specific form of Laplacian regularization, unifying two strands of image denoising methods, namely symmetric smoothing filters and spectral graph theory.
Observation Model
Given the noisy-image observation model
\( \mathbf{y}=\mathbf{z}+\mathbf{n}\),
where \(\mathbf{y}\) and \(\mathbf{z}\) are the \(N_p\)-dimensional observed and true image vectors, respectively, and \(\mathbf{n} \sim \mathcal{N}(\mathbf{0},\sigma^2\mathbf{I})\). In the denoising problem, we wish to recover \(\mathbf{z}\) from \(\mathbf{y}\).
Nonparametric Regression
In a nonparametric regression setting, one estimates \(\mathbf{z}\) with the filter
\(\hat{\mathbf{z}} = \mathbf{W}{\mathbf{y}} = \mathbf{D}^{-1}\mathbf{K} {\mathbf{y}}\)
where \(\hat{\mathbf{z}}\) is the estimate, \( K_{ij} \) is defined by the choice of a kernel function that represents the notion of similarity we wish to capture, and \( \mathbf{D} \) is diagonal with \( i \)th entry \( {\sum}_{j} K_{ij} \).
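The filter above can be sketched numerically. The following is a minimal illustration on a 1-D toy signal with a Gaussian spatial kernel (our own toy setup for illustration, not the graph construction used in the paper):

```python
import numpy as np

# Hypothetical 1-D signal standing in for a vectorized image.
rng = np.random.default_rng(0)
z = np.repeat([0.0, 1.0, 0.5], 20)           # piecewise-constant ground truth
y = z + 0.1 * rng.standard_normal(z.size)    # noisy observation y = z + n

# Gaussian kernel on pixel coordinates: K_ij = exp(-(i-j)^2 / (2 h^2)).
idx = np.arange(z.size)
h = 2.0
K = np.exp(-(idx[:, None] - idx[None, :]) ** 2 / (2 * h ** 2))

# Row-normalize: W = D^{-1} K, with D diagonal holding the row sums of K.
W = K / K.sum(axis=1, keepdims=True)

z_hat = W @ y                                # filtered estimate
assert np.allclose(W.sum(axis=1), 1.0)       # W is row-stochastic
```

Row-stochasticity of \(\mathbf{W}\) is what makes it an averaging matrix: each output pixel is a convex combination of input pixels.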
Notion of Consistency
Strict Consistency
Our notion of consistency implies
\(\hat{\mathbf{z}} = \mathbf{W}(\hat{\mathbf{z}})\hat{\mathbf{z}} = \mathbf{W}\hat{\mathbf{z}} \)
i.e.,
\(\left(\mathbf{I}-\mathbf{W}\right)\hat{\mathbf{z}} = \mathcal{L}\hat{\mathbf{z}} = \mathbf{0}\)
where \(\mathcal{L} =\mathbf{I}-\mathbf{W}\) is the normalized Laplacian. The consistency constraint tells us that the filtered image has the same intensity relationships as encoded by \(\mathbf{W} \), and no amount of further filtering will change this condition.
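Because \(\mathbf{W}\) is row-stochastic, constant images always satisfy strict consistency: \(\mathbf{W}\mathbf{1}=\mathbf{1}\), so \(\mathcal{L}\mathbf{1}=\mathbf{0}\). A small numpy check of this fact, on a random symmetric affinity matrix of our own choosing:

```python
import numpy as np

# Any row-stochastic W fixes constant images: W 1 = 1, so (I - W) 1 = 0.
rng = np.random.default_rng(1)
K = rng.random((6, 6))
K = (K + K.T) / 2                      # symmetric affinities
W = K / K.sum(axis=1, keepdims=True)   # W = D^{-1} K

L = np.eye(6) - W                      # normalized Laplacian I - W
ones = np.ones(6)
assert np.allclose(L @ ones, 0.0)      # constant image lies in the null space of L
```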
Relaxed Consistency
Strict consistency admits only piecewise-constant solutions. This is excessively restrictive for denoising natural images, which are far from piecewise constant. Hence, we replace the strict consistency criterion \(\mathbf{z} = \mathbf{W} \mathbf{z} \) with the more general consistency model
\( \mathbf{z} = \mathbf{W} \mathbf{z} + \mathbf{\epsilon}(\mathbf{z}) \)
where \(\mathbf{\epsilon}(\mathbf{z}) \) measures the degree to which an estimated image deviates from the consistency requirement specified by \(\mathbf{W} \); the argument \(\mathbf{z} \) emphasizes that this measure of inconsistency depends on \(\mathbf{z} \).
Our Solution is the \(\mathbf{C}\) filter
Our denoising framework minimizes the following cost function:
\(\hat{\mathbf{z}} = \displaystyle{\textrm{argmin}_{\mathbf{z}}}\, \|\mathbf{z}-\mathbf{y}\|^2 +\lambda \|\mathbf{\Lambda}\left(\mathcal{L}\mathbf{z}\right)\|^2\)
or,
\(\hat{\mathbf{z}} = \displaystyle{\textrm{argmin}_{\mathbf{z}}}\, \|\mathbf{z}-\mathbf{y}\|^2 +\lambda \|\mathbf{L}\mathbf{z}\|^2, \quad \mathbf{L}=\mathbf{\Lambda}\mathcal{L}\)
which gives \(\mathbf{C}=(\mathbf{I}+\lambda\mathbf{L}^{T}\mathbf{L})^{-1}\), so that \(\hat{\mathbf{z}}=\mathbf{C}\mathbf{y}\).
Properties of the \(\mathbf{C}\) filter
- A stable, consistent smoothing filter
- Belongs to the family of generalized doubly stochastic matrices
- A Wiener filter in the eigenspace of the Laplacian
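These properties can be checked numerically. The sketch below assumes \(\mathbf{\Lambda}\) is the degree matrix \(\mathbf{D}\), so that \(\mathbf{L}=\mathbf{D}-\mathbf{K}\) is the unnormalized graph Laplacian (our reading of the formulation above), and uses a random symmetric affinity matrix rather than an image graph:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
K = rng.random((n, n)); K = (K + K.T) / 2    # symmetric affinities
D = np.diag(K.sum(axis=1))
Lmat = D - K                                 # assumed L = Lambda (I - W) with Lambda = D

lam = 0.5
C = np.linalg.inv(np.eye(n) + lam * Lmat.T @ Lmat)

assert np.allclose(C, C.T)                   # symmetric
assert np.allclose(C.sum(axis=1), 1.0)       # rows sum to 1 (L 1 = 0 implies C 1 = 1)
assert np.allclose(C.sum(axis=0), 1.0)       # columns too: generalized doubly stochastic

# Wiener-filter view: diagonalize L^T L = U diag(s) U^T; then C shrinks each
# graph-frequency component by 1 / (1 + lam * s).
s, U = np.linalg.eigh(Lmat.T @ Lmat)
C_spec = U @ np.diag(1.0 / (1.0 + lam * s)) @ U.T
assert np.allclose(C, C_spec)
```

The shrinkage factors \(1/(1+\lambda s_k)\) attenuate high graph frequencies (large \(s_k\)) while passing the low-frequency, consistent components almost unchanged.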
Comparison with the usual Graph Regularization
Figure 1.a shows the adjacency-matrix visualization of the regular KNN + bilateral graph; Figure 1.b shows that of our enriched equivalent graph.

Results
We compare the performance of our C filter with the vanilla filter W constructed from the regular KNN + bilateral graph, its Sinkhorn-symmetrized version S(W), the usual graph-regularized filter R = (I + λL)^{-1}, K-SVD and BM3D, in Table 1 below.
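For intuition on how the baseline R differs from C: R minimizes the quadratic-form penalty \(\mathbf{z}^{T}\mathbf{L}\mathbf{z}\) while C penalizes \(\|\mathbf{L}\mathbf{z}\|^{2}\). A hedged sketch under the same toy assumptions as above (\(\mathbf{L}=\mathbf{D}-\mathbf{K}\) on a random symmetric affinity matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
K = rng.random((n, n)); K = (K + K.T) / 2
D = np.diag(K.sum(axis=1))
Lmat = D - K
lam = 0.5

R = np.linalg.inv(np.eye(n) + lam * Lmat)            # usual graph regularizer
C = np.linalg.inv(np.eye(n) + lam * Lmat.T @ Lmat)   # consistency filter

# Both fix constant images, since L 1 = 0 in either case.
assert np.allclose(R.sum(axis=1), 1.0)
assert np.allclose(C.sum(axis=1), 1.0)
```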