Friday 23 May 2014

Magnitude in Astronomy

Magnitudes
The magnitude scale was invented by an ancient Greek astronomer named Hipparchus in about 150 B.C. He ranked the stars he could see in terms of their brightness, with 1 representing the brightest down to 6 representing the faintest. Modern astronomy has extended this system to stars brighter than Hipparchus' 1st magnitude stars and ones much, much fainter than 6.
As it turns out, the eye senses brightness logarithmically, so each increase of 5 magnitudes corresponds to a decrease in brightness by a factor of 100. The absolute magnitude is the magnitude a star would have if viewed from a distance of 10 parsecs, or some 32.6 light years. On a list of the brightest stars in the sky, Deneb must obviously be intrinsically very bright to qualify despite its great distance, while Rigel, of nearly the same absolute magnitude but closer, stands even higher on such a list. Note that most of these stars are, on a cosmic scale, really nearby, and that their distances are generally uncertain by at least 20%. All stars are variable to some extent; those which are visibly variable are usually marked with a "v".
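To make the factor-of-100 rule concrete, here is a small Python sketch; the magnitudes used in the example, 1.0 and 6.0, are just the Hipparchus-style endpoints mentioned above.

def brightness_ratio(m1, m2):
    # How many times brighter a star of magnitude m1 is than one of
    # magnitude m2: a 5-magnitude step is defined as a factor of 100.
    return 100 ** ((m2 - m1) / 5)

# A 1st-magnitude star compared with a 6th-magnitude star:
print(brightness_ratio(1.0, 6.0))   # 100.0, exactly the factor described above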
Apparent and absolute magnitudes
Apparent magnitude is how bright a star appears to us in the sky. The scale is somewhat arbitrary, as explained above, but a magnitude difference of 5 has been set to exactly a factor of 100 in intensity. Absolute magnitude is how bright a star would appear from some standard distance, arbitrarily set at 10 parsecs or about 32.6 light years. Stars can be as bright as absolute magnitude -8 and as faint as absolute magnitude +16 or fainter. There are thus (a very few) stars intrinsically more than 100 times brighter than Sirius, while hardly any are known fainter than Wolf 359.
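Going from apparent to absolute magnitude uses the standard distance-modulus formula, M = m - 5 log10(d / 10 pc). Here is a little Python sketch of it; the figures for Sirius (apparent magnitude -1.46, distance about 2.64 parsecs) are rounded values assumed for the example.

import math

def absolute_magnitude(apparent_mag, distance_pc):
    # The apparent magnitude the star would have if moved to the
    # standard distance of 10 parsecs: M = m - 5 log10(d / 10 pc).
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Sirius: apparent magnitude about -1.46 at a distance of roughly 2.64 pc
print(round(absolute_magnitude(-1.46, 2.64), 2))   # roughly +1.43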
