# BAFact Math: The Sun is 400,000 times brighter than the full Moon

The entire universe in blog form
Aug. 27 2012 10:02 AM

[BAFacts are short, tweetable astronomy/space facts that I post every day. On some occasions, they wind up needing a bit of a mathematical explanation. The math is pretty easy, and it adds a lot of coolness, which I'm passing on to you! You're welcome.]

Phil Plait writes Slate’s Bad Astronomy blog and is an astronomer, public speaker, science evangelizer, and author of Death From the Skies!

Today's BAFact: The Sun is 400,000 times brighter than the full Moon in the sky.

If you've ever looked at the full Moon through a telescope, you know how painfully bright it can be. But you can do it if you squint, or use a mild filter to block some of the light.

On the other hand, if you try the same thing with the Sun (hint: don't) you'll end up with a fried retina and an eyeball filled with boiling vitreous humor.

So duh, the Sun is much brighter than the Moon. But how much brighter?

Astronomers use a brightness system called magnitudes. It's actually been around for thousands of years, first contrived by the Greek astronomer Hipparchus. It's a little weird: first, it's not linear. That is, an object twice as bright as another doesn't have twice the magnitude value. Instead, the system is logarithmic, with a base of 2.512. Blame Hipparchus for that: he figured the brightest stars were 100 times brighter than the dimmest stars, and used a five-step system [Update: My mistake, apparently he didn't know about the factor of 100; that came later.]. The fifth root of 100 is 2.512 (or, if you prefer, 2.512^5 = 2.512 x 2.512 x 2.512 x 2.512 x 2.512 = 100), so there you go. I'll give examples in a sec...
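If you want to see that fifth-root relationship for yourself, here's a quick sketch in Python (the variable names are just mine, not any standard):

```python
# Five magnitude steps correspond to a factor of 100 in brightness,
# so the base of the magnitude scale is the fifth root of 100.
base = 100 ** (1 / 5)

print(round(base, 3))    # 2.512
print(round(base ** 5))  # 100 -- five steps of 2.512 get you back to 100x
```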

Second, the other weird thing about the magnitude system is that it's backwards: a brighter star has a lower number. It's like an award; getting first place is better than third. So a bright star might be first magnitude, and a dimmer one third magnitude.

To figure out how much brighter one star actually is than another, subtract the brighter star's magnitude from the dimmer one's, and then take 2.512 to that power. As an example, the star Achernar has a magnitude of roughly 0.5. Hamal, the brightest star in the constellation of Aries, has a magnitude of 2.0. Therefore, Achernar is 2.512^(2.0 - 0.5) = 2.512^1.5 = 4 times brighter than Hamal. So you can say it's four times brighter, or 1.5 magnitudes brighter. Same thing.
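That recipe is easy to wrap in a few lines of Python if you want to play with other pairs of stars (the function name is mine, not any astronomy library's):

```python
def brightness_ratio(mag_bright, mag_dim):
    """How many times brighter the first object is than the second,
    given their magnitudes (lower magnitude = brighter)."""
    return 2.512 ** (mag_dim - mag_bright)

# Achernar (mag 0.5) vs. Hamal (mag 2.0):
print(round(brightness_ratio(0.5, 2.0), 1))   # 4.0, i.e. 4 times brighter

# Sirius (mag -1.5) vs. Achernar (mag 0.5):
print(round(brightness_ratio(-1.5, 0.5), 1))  # 6.3, i.e. about 6 times brighter
```

That second call is the "check my math if you want" case below: two magnitudes of difference works out to 2.512^2, or roughly a factor of 6.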

It's weird, but actually pretty handy for astronomers. And it doesn't stop at 0. A really bright object can have a negative magnitude, and the math still works. For example, Sirius, the brightest star in the night sky, has a magnitude of about -1.5 (making it 6 times as bright as Achernar - check my math if you want). Which brings us to the topic at hand...

The Moon is pretty bright, and when it's full has a magnitude of about -12.7. That's bright enough to read by! But the Sun is way, way brighter. Its magnitude is a whopping -26.7. How much brighter is that?

Well, it's 2.512^(-12.7 - (-26.7)) = 2.512^14 = 400,000.

In other words, the Sun is 400,000 times brighter than the full Moon!
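And if you don't trust my exponents, the whole calculation fits in two lines of Python:

```python
# Sun (mag -26.7) vs. full Moon (mag -12.7): 14 magnitudes apart.
ratio = 2.512 ** (-12.7 - (-26.7))
print(round(ratio, -4))  # ~400,000
```

(The exact value comes out just under 400,000; the inputs are only rough magnitudes anyway, so the round number is the honest one.)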