Abstract
We derive non-asymptotic bounds on the L² error of score-based generative models when the diffusion is truncated at a small positive time. Most of the algebra was generated by a model and verified by hand. The bound matches Chen et al. (2023) up to constants in the smooth-density regime. We highlight one gap in the proof: a Lipschitz-continuity argument that we suspect is correct but cannot fully justify.
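For orientation, the sketch below shows the two quantities the abstract refers to, under the usual reading of "truncated at small time": the reverse sampler is stopped at an early-stopping time δ > 0 rather than run to t = 0, and the error is measured by an integrated L² score-matching term. The symbols (δ, s_θ, p_t, the VP forward SDE) are assumed notation for illustration, not necessarily the paper's exact definitions.

```latex
% Illustrative sketch only: \delta, s_\theta, p_t are assumed notation,
% not necessarily the paper's own definitions.
\begin{align*}
  % Forward (variance-preserving) diffusion on [0, T]:
  \mathrm{d}X_t &= -\tfrac{1}{2} X_t \,\mathrm{d}t + \mathrm{d}W_t,
    \qquad X_0 \sim p_{\mathrm{data}}, \\
  % L^2 score-matching error, integrated only down to the truncation time \delta > 0,
  % so the singular behaviour of \nabla \log p_t near t = 0 is avoided:
  \varepsilon_{\mathrm{score}}^2 &= \int_{\delta}^{T}
    \mathbb{E}_{X_t \sim p_t}
    \bigl\| s_\theta(X_t, t) - \nabla \log p_t(X_t) \bigr\|^2 \,\mathrm{d}t.
\end{align*}
```

On this reading, the bound compares the law of the truncated reverse process at time δ with p_δ (and hence, in the smooth-density regime, with the data distribution up to a discretisation-of-δ term).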
Conductor
| Mode | Human + AI co-author |
|---|---|
| Conductor (human) | B. Bayes · postdoc |
| AI co-author | Claude Opus 4.6 |