That looks to me like they are using a deep-learning (CNN-based) denoiser. NVIDIA's OptiX AI denoiser can produce similar artifacts.
However, it appears the training omitted a loss term penalizing the denoised result for deviating too far from the source image. NVIDIA's denoiser exposes a user-configurable parameter for exactly this trade-off.
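To make that idea concrete, here is a minimal PyTorch sketch of such a fidelity term. The names `model`, `noisy`, and `reference`, and the weight value, are hypothetical; this is an illustration of the technique, not how the denoiser in question is actually trained:

```python
import torch
import torch.nn.functional as F

def denoiser_loss(model, noisy, reference, fidelity_weight=0.1):
    """Hypothetical training loss: reconstruction plus a source-fidelity term.

    noisy:     the low-sample-count input image (the 'source')
    reference: a high-sample-count ground-truth render
    """
    denoised = model(noisy)
    # Primary term: match the clean reference image.
    reconstruction = F.l1_loss(denoised, reference)
    # Fidelity term: keep the output close to the noisy source, so the
    # network cannot hallucinate detail that was never in the input.
    fidelity = F.l1_loss(denoised, noisy)
    return reconstruction + fidelity_weight * fidelity
```

A similar effect can also be had after the fact: the OptiX denoiser's `blendFactor` parameter blends the noisy input back into the denoised output, which bounds how far the result can drift from the source.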