You guys must be yanking my chain! Ok, I found an answer I can understand:
Light does not actually travel as a bundle of straight rays. This is an approximate model that does reasonably well if the wavelength of light involved is much smaller than any significant feature of the rest of the system. Light travels as a self-interfering, self-propagating, oscillating electromagnetic field. Every light beam with non-infinite beam width will diverge because of the way the field interferes with itself, even if it was at one point somewhat collimated. Some people call this diffraction and others call it interference. Many books make it sound like diffraction is caused by a light beam interacting with an obstacle (such as a screen with a slit), but in reality the diffraction is caused by the beam itself after being given a certain shape by an obstacle.
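Just to convince myself this self-spreading isn't some negligible effect, I wrote a quick Python back-of-envelope using the standard Gaussian-beam divergence formula, half-angle ≈ wavelength / (pi * waist). The wavelength and waist size are my own made-up numbers, not from the quote:

```python
import math

# Quick check that a finite-width beam must spread.
# Assumed values (not from the quote): 633 nm wavelength, 0.5 mm beam waist.
wavelength = 633e-9   # m
waist = 0.5e-3        # beam waist radius w0, m

# Far-field half-angle divergence of an ideal Gaussian beam
theta = wavelength / (math.pi * waist)
print(f"divergence half-angle: {theta * 1e3:.2f} mrad")

# Extra radius the beam picks up after 100 m of travel (far-field estimate)
z = 100.0
print(f"extra beam radius after {z:.0f} m: {theta * z * 100:.1f} cm")
```

With those assumed numbers, a half-millimetre beam picks up a few centimetres of extra radius over 100 m, purely from interfering with itself.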
Edit: I found more. I asked the right question on Google and a bunch of information coins started flowing:
There are physical limitations that prevent us from having a beam of photons that continue in a perfectly parallel fashion. The closest we can get is a beam we call collimated--that is, not focusing or diverging, but staying roughly the same diameter as it travels--that's what we think of as a laser beam. But even collimated beams spread apart with some small divergence angle. The deep reality of this is the uncertainty principle. We know the laser's position to a certain accuracy because the photons had to originate within the laser material. So, since we know the position (perpendicular to the beam's direction of travel) of every photon to some accuracy, we know there is some spread in the momentum in that same direction. If we know the position with less certainty (that is, a larger laser material or larger beam), we know the momentum with more certainty. A photon's momentum component perpendicular to the beam sets how much it drifts sideways as it travels, so it follows that if we know a certain photon started in a finite-size laser crystal, we don't know that sideways drift exactly. Thus we can't correct for it. We can use a lens to trade between position and momentum uncertainty--we can make the beam bigger so it doesn't diverge as much, or vice versa--but we can never make divergence zero without making an infinitely-wide beam.
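I tried to put numbers on that uncertainty-principle argument myself. This is only my sketch of the estimate: take the beam radius as the position uncertainty, apply delta_x * delta_p >= hbar/2, and divide by the photon's forward momentum to get an angle. All the specific values are assumptions I picked:

```python
import math

# My attempt at the uncertainty-principle estimate from the quote above.
# Assumed values: 633 nm light; beam radii of 0.5, 1 and 2 mm.
hbar = 1.0545718e-34            # J*s
h = 2 * math.pi * hbar
wavelength = 633e-9             # m
p_forward = h / wavelength      # photon momentum along the beam

def min_divergence(delta_x):
    """Smallest angular spread allowed by delta_x * delta_p >= hbar / 2."""
    delta_p_perp = hbar / (2 * delta_x)     # transverse momentum spread
    return delta_p_perp / p_forward         # small-angle spread, radians

for radius in (0.5e-3, 1.0e-3, 2.0e-3):
    print(f"beam radius {radius * 1e3:.1f} mm -> "
          f"divergence >= {min_divergence(radius) * 1e6:.0f} urad")
# A wider beam (less certain position) is allowed a smaller divergence,
# which is exactly the trade-off described above.
```

Doubling the beam radius halves the guaranteed divergence, but it never reaches zero for any finite beam.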
If you want a 1 mm speck on the moon, it's certainly possible to use a lens to focus the laser--essentially making it non-collimated, so we increase the spread in angle to decrease the spread in position. The governing quantity for how small you can focus the beam with a lens is the f-number: the distance between the lens and the focus divided by the diameter of the laser beam when it enters the lens. So, you could place a small lens close to the moon so the beam is focused rapidly to the 1 mm spot, or you could place a huge lens on earth so it converges over its entire journey to the moon.
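Out of curiosity I ran the "huge lens on earth" case through the diffraction-limited (Airy) spot formula, spot ≈ 2.44 * wavelength * f-number, with the f-number defined as in the quote. The wavelength and the earth-moon distance are my own assumed values:

```python
# Back-of-envelope for the "huge lens on earth" option, using the
# diffraction-limited (Airy) spot size: spot ≈ 2.44 * wavelength * f_number.
# Assumed values: 532 nm green light and the mean earth-moon distance.
wavelength = 532e-9      # m
distance = 3.844e8       # m, earth-moon distance
target_spot = 1e-3       # m, the 1 mm spot

# spot = 2.44 * wavelength * (distance / lens_diameter), solved for diameter
lens_diameter = 2.44 * wavelength * distance / target_spot
f_number = distance / lens_diameter

print(f"lens diameter needed on earth: {lens_diameter / 1e3:.0f} km")
print(f"corresponding f-number: {f_number:.0f}")
```

With these assumed numbers it comes out to a lens roughly 500 km across, which is a nice way of seeing why nobody tries to paint a 1 mm spot on the moon from down here.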
Focusing light from stars is a slightly different problem. Stars are so far away that the light we see from them doesn't look like it's spreading out much. A telescope can catch some of that light and focus it to a small point, just like we did with the lens on the moon focusing the laser beam from earth. The only difference is that the star emits in all directions, so we're only catching a small portion of its light. We build bigger telescopes to catch more of it and thus be able to focus the light to a smaller point. You'd be doing the exact same thing if the lens on the moon were smaller than the laser beam was when it got there.
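And here's the telescope version of the same arithmetic, again just my sketch: for a fixed focal length, the diffraction-limited focal spot is roughly 2.44 * wavelength * focal length / aperture, and the collected light scales with aperture area. The wavelength and focal length are assumed values:

```python
import math

# Why bigger telescopes help: for a fixed focal length, a bigger aperture
# both collects more starlight and focuses it to a smaller spot.
# Assumed values: 550 nm light and a 10 m focal length.
wavelength = 550e-9      # m
focal_length = 10.0      # m

for aperture in (0.1, 1.0, 10.0):                       # aperture diameter, m
    spot = 2.44 * wavelength * focal_length / aperture  # Airy spot diameter
    area = math.pi * (aperture / 2) ** 2                # light-gathering area
    print(f"D = {aperture:5.1f} m: spot ≈ {spot * 1e6:6.1f} um, "
          f"collecting area ≈ {area:7.2f} m^2")
```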
Ching ching, more flowing:
The waves in laser light are not parallel. It is theoretically impossible to construct a beam with perfectly parallel rays unless you have an infinitely wide beam. As described in the textbook “Principles of Lasers” by Orazio Svelto, even a perfectly spatially coherent beam will spread out due to diffraction. Diffraction means that all waves – including sound, water, radio, and light – bend around corners. And it's not just the edge of the wave that bends around the corner. It is the entire wave. This means that a beam of light that is shone through a hole spreads out as it travels. A beam with perfectly parallel rays would never spread out. Every beam of light has a finite beam width and therefore can be thought of as emanating from a hole. Diffraction is a wave effect, so it applies to laser beams as well.
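The "shone through a hole" picture is easy to play with too. Here's a crude single-slit-style estimate where the spread half-angle is roughly wavelength divided by hole size; the hole sizes and distance are just values I picked:

```python
# Crude single-slit-style estimate of how a beam spreads after a hole:
# the diffraction half-angle is roughly wavelength / hole size.
# Assumed values: 500 nm light, 1 km of travel, a few hole sizes.
wavelength = 500e-9    # m
distance = 1000.0      # m

for hole in (1e-3, 1e-2, 3e-2):                  # hole diameter, m
    half_angle = wavelength / hole               # rough spread half-angle
    width = hole + 2 * distance * half_angle     # beam width after 1 km
    print(f"{hole * 1e3:5.1f} mm hole -> about {width * 100:6.1f} cm wide after 1 km")
```

The millimetre-wide hole ends up about a metre wide a kilometre later, while the wider holes barely spread: same wave effect, just less of it.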
Now I have a pocket full of divergence understanding, finally. Don't know why I couldn't ask the right questions before.