A Comparative Study of Noise Schedules in Denoising Diffusion Probabilistic Models

Emmanuel Martínez Guerrero, Guohua Sun

Abstract


Noise scheduling plays a crucial role in the performance of denoising diffusion probabilistic models (DDPMs), affecting both training dynamics and sample quality. Although various noise schedules have been proposed, comprehensive comparative analyses remain limited. In this work, we evaluate five widely used noise schedules (linear, cosine, quadratic, sigmoid, and exponential) across three datasets of increasing complexity: MNIST, Fashion-MNIST, and CIFAR-10. We analyze their impact on training performance and generative quality using the Fréchet Inception Distance (FID), Kernel Inception Distance (KID), and Inception Score (IS). Our quantitative results show that the linear schedule converges fastest during training, whereas the exponential schedule performs worst. In contrast, the cosine, quadratic, and sigmoid schedules tend to produce higher-quality samples, with the best choice depending on dataset complexity. Qualitative analysis reveals that nonlinear schedules such as cosine and exponential accelerate the formation of structured, recognizable images early in training, suggesting greater efficiency in producing better samples with less training. Our findings indicate that nonlinear schedules may be preferable when early sample quality is critical, while the linear schedule offers advantages in training speed.
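For readers unfamiliar with the five schedules compared above, the following sketch shows common parameterizations of each as a sequence of DDPM variances beta_t. These are typical formulations from the literature (e.g., the linear schedule of Ho et al. and the cosine schedule of Nichol and Dhariwal); the exact endpoints and constants used in the paper may differ.

```python
import numpy as np

def linear_schedule(T, beta_start=1e-4, beta_end=0.02):
    # Evenly spaced betas (Ho et al., 2020).
    return np.linspace(beta_start, beta_end, T)

def quadratic_schedule(T, beta_start=1e-4, beta_end=0.02):
    # Linear in sqrt(beta), so betas grow quadratically.
    return np.linspace(beta_start ** 0.5, beta_end ** 0.5, T) ** 2

def sigmoid_schedule(T, beta_start=1e-4, beta_end=0.02):
    # Sigmoid-shaped ramp between beta_start and beta_end;
    # the [-6, 6] input range is a common convention.
    x = np.linspace(-6, 6, T)
    return 1.0 / (1.0 + np.exp(-x)) * (beta_end - beta_start) + beta_start

def exponential_schedule(T, beta_start=1e-4, beta_end=0.02):
    # Geometric interpolation from beta_start to beta_end.
    return beta_start * (beta_end / beta_start) ** (np.arange(T) / (T - 1))

def cosine_schedule(T, s=0.008, max_beta=0.999):
    # Defined via the cumulative signal level alpha_bar(t)
    # (Nichol & Dhariwal, 2021), with betas clipped for stability.
    t = np.arange(T + 1) / T
    alpha_bar = np.cos((t + s) / (1 + s) * np.pi / 2) ** 2
    betas = 1.0 - alpha_bar[1:] / alpha_bar[:-1]
    return np.clip(betas, 0.0, max_beta)
```

In each case the forward process perturbs data according to the cumulative product alpha_bar_t = prod(1 - beta_t), so the shape of the schedule controls how quickly signal is destroyed over the T diffusion steps.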

Keywords


Generative models; Diffusion models; Denoising diffusion probabilistic models; Noise schedule
