Spivak states the following (on pages 370–371 of the fourth edition of Calculus): "If we can find $\int_{a}^{b}f(x)\,dx$ for all $a$ and $b$, then we can certainly find $\int f(x)\,dx$." In other words, knowing the definite integral for all possible $a$ and $b$ implies we have an antiderivative of $f$, i.e. a function $F$ such that $F'=f$. But how do you prove this? (I'm not even sure how to show that such an $F$ is differentiable.)
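To state the claim precisely in my own notation: fixing some point $a$, the assertion seems to be that the function $$F(x) = \int_{a}^{x} f(t)\,dt$$ satisfies $F'(x) = f(x)$ for every $x$, so that $F$ is an antiderivative of $f$.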
The example he provides is the following: since for all $a$ and $b$ $$\int_{a}^{b}\sin^{5}x \cos x \, dx = \frac{\sin^{6} b}{6} - \frac{\sin^{6} a}{6},$$ it follows that $$\int \sin^{5}x \cos x \, dx = \frac{\sin^{6} x}{6}.$$ The example makes sense, but why does this work for a general $f$?
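For what it's worth, I can verify this particular example directly by differentiating the proposed antiderivative with the chain rule: $$\frac{d}{dx}\left(\frac{\sin^{6} x}{6}\right) = \frac{6\sin^{5} x \cos x}{6} = \sin^{5}x \cos x.$$ So the example checks out, but this verification doesn't tell me why knowing the definite integrals guarantees such an $F$ exists in general.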
I always thought you used indefinite integrals to evaluate definite integrals, so I'm confused as to why we can go in the other direction.