This research introduces a novel generative model that produces samples via Langevin dynamics using the score, i.e., the gradient of the log data density, estimated with score matching. Because real data often lie on low-dimensional manifolds, where score estimation and Langevin mixing break down, the model perturbs the data with Gaussian noise at multiple scales and jointly estimates the scores of all noise levels with a single network. A key innovation is annealed Langevin dynamics, which gradually decreases the noise level during sampling, using the score at each level to guide samples toward the data manifold. The resulting model generates images of quality comparable to GANs, achieving a state-of-the-art Inception score of 8.87 on CIFAR-10, and demonstrates effective representation learning through image inpainting. Because it avoids adversarial training, the method also offers a principled, tractable objective for model comparison.
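The annealed Langevin dynamics procedure can be sketched in a few lines. The snippet below is a minimal 1-D illustration, not the paper's implementation: it assumes a toy analytic score for a standard normal smoothed by each noise level, standing in for the learned score network, and uses a geometric noise schedule and the paper's per-level step-size scaling `alpha_i = eps * sigma_i^2 / sigma_L^2` (the constants `eps`, the schedule endpoints, and `T` are illustrative choices).

```python
import numpy as np

def toy_score(x, sigma):
    # Analytic score of a 1-D standard normal convolved with N(0, sigma^2):
    # p_sigma = N(0, 1 + sigma^2), so grad_x log p_sigma(x) = -x / (1 + sigma^2).
    # A trained score network s_theta(x, sigma) would replace this in practice.
    return -x / (1.0 + sigma**2)

def annealed_langevin(score_fn, n_samples=5000, sigmas=None,
                      eps=1e-4, T=100, seed=0):
    """Run Langevin dynamics at each noise level sigma_i in decreasing
    order, scaling the step size proportionally to sigma_i^2."""
    rng = np.random.default_rng(seed)
    if sigmas is None:
        sigmas = np.geomspace(10.0, 0.01, 10)  # large noise -> small noise
    x = rng.normal(0.0, sigmas[0], size=n_samples)  # init from broad prior
    for sigma in sigmas:
        alpha = eps * sigma**2 / sigmas[-1]**2  # annealed step size
        for _ in range(T):
            z = rng.normal(size=n_samples)
            # Langevin update: drift along the score plus injected noise.
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

samples = annealed_langevin(toy_score)
```

Starting from a broad distribution and annealing downward lets the chain mix at high noise levels, where scores are well defined everywhere, before refining samples at low noise levels near the data distribution; here the final samples concentrate around the standard normal target.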