How Bright is Bright?

Updated August 27, 2019 | Factmonster Staff

You will often hear astronomers refer to the brightness of an object in terms of the object's apparent magnitude.

The term apparent magnitude refers to a star's apparent brightness, or how bright the star looks to observers on Earth. Apparent stellar brightness is a combination of two things: how bright the star actually is, and how far the star is from Earth. A dim star close to Earth could appear brighter than a bright star far away.

Back in the second century B.C., the Greek astronomer Hipparchus created the first known star catalog, listing star positions and brightness. This catalog contained about a thousand stars. The brightest stars were called first magnitude, and the faintest were called sixth magnitude, with the rest being given intermediate magnitudes. This was a crude method of classification, for his observations were made with the unaided eye. His sixth magnitude stars were the faintest naked-eye objects, and the prevailing view at the time was that the universe extended no further than what the naked eye could see.

Today, magnitude classifications are made using highly precise instruments, but Hipparchus's general classification of magnitudes remains. The branch of astronomy that deals with an object's brightness is called photometry, and toward the end of the 1700s the astronomer William Herschel devised a simple method of photometry. While Herschel's methods worked, they were only approximations. One key result of his work was the determination that a first magnitude star delivers about 100 times as much light to Earth as a sixth magnitude star.

In 1856, after more precise methods of photometry had been invented, British astronomer Norman Pogson proposed a quantitative scale of stellar magnitudes, which was adopted by the astronomical community. He noted, like Herschel, that we receive about 100 times as much light from a first magnitude star as from a sixth; thus a difference of five magnitudes corresponds to a 100:1 ratio of incoming light energy, which is called luminous flux.

Because of the nature of human perception, equal intervals of brightness are actually equal ratios of luminous flux. Pogson therefore proposed that a difference of one magnitude correspond to a brightness ratio equal to the fifth root of 100, so each step of one magnitude changes the received energy by a factor of about 2.512. A fifth magnitude star is 2.512 times as bright as a sixth, a fourth magnitude star is 6.310 times as bright as a sixth, and so on. The naked eye, under optimum conditions, can see down to around the sixth magnitude, that is, +6. Under the Pogson system, a few of the brighter stars have negative magnitudes; Sirius, for example, is –1.5. The lower the magnitude number, the brighter the object. The full moon has a magnitude of about –12.5, and the Sun is a bright –26.5!
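As a quick illustration of Pogson's ratio, here is a minimal sketch in Python (not part of the original article; the function name flux_ratio is ours) that turns a magnitude difference into a brightness ratio using the relation ratio = 100^(difference/5) and reproduces the figures quoted above.

    # Minimal sketch of Pogson's magnitude scale (illustrative only).
    # A difference of 5 magnitudes corresponds to a flux ratio of exactly 100,
    # so one magnitude corresponds to a factor of 100 ** (1/5), about 2.512.

    def flux_ratio(brighter_mag, fainter_mag):
        """Return how many times more light we receive from the brighter object."""
        delta_m = fainter_mag - brighter_mag      # magnitude difference
        return 100 ** (delta_m / 5)               # Pogson's ratio

    print(flux_ratio(5, 6))       # ~2.512  (fifth vs. sixth magnitude)
    print(flux_ratio(4, 6))       # ~6.310  (fourth vs. sixth magnitude)
    print(flux_ratio(1, 6))       # 100.0   (first vs. sixth magnitude)
    print(flux_ratio(-1.5, 6))    # ~1000   (Sirius vs. a faint naked-eye star)

Running the sketch shows why the scale is so convenient: every five magnitudes simply adds another factor of 100 to the brightness ratio.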

Source: U.S. Naval Observatory.
