Take a glance at the star-filled sky above and you can clearly see that not all stars have the same brightness. Some stars look brighter than others, while some are so faint that you simply cannot see them with the naked eye; you need a telescope to see them. Most stars are so faint that you will never see them without optical aid. The brightness of a star, as seen from Earth, is measured on a scale called the magnitude scale, while the total amount of light a star actually gives off is called its luminosity. When you glance at the sky and notice that some stars are brighter than others, you are in effect comparing their magnitudes.
The brightest stars would have a magnitude of 1 or less and a very faint star would have a magnitude of 6. This system of classifying stars based on their brightness was developed by the Greek astronomer Hipparchus of Rhodes around 130 BC. He divided the stars into six groups, with the brightest stars being first magnitude and the faintest being sixth magnitude. Although measuring the brightness of stars is an ancient idea, the technology has become far more sophisticated, with astronomers now using precise photometric instruments to obtain more accurate readings. Astronomers today use the apparent and absolute magnitude scales to describe the brightness of stars.
What is Absolute Magnitude?
Absolute magnitude is a measure of a star’s luminosity: how bright the star would be if viewed from a distance of 10 parsecs, or about 32.6 light years. The idea is that to determine the true brightness of a light source, we need to know how far away it is. Astronomers take 10 parsecs as the standard distance and call the intrinsic brightness of the star its absolute visual magnitude, the apparent magnitude the star would have if it were exactly 10 parsecs away. Absolute magnitude is therefore related to the intrinsic luminosity of the star. In simple terms, it is defined as the apparent magnitude measured from a distance of 10 parsecs. The symbol for absolute magnitude is “Mv” (an uppercase ‘M’ with a subscript ‘v’).
What is Apparent Magnitude?
Apparent magnitude is a measure of how bright a star appears when viewed from Earth. It is one way of expressing how bright a celestial object looks as viewed from a dark observing site. Magnitude and apparent magnitude mean the same thing, namely how bright a celestial object appears to us on Earth, ranked on the historic logarithmic magnitude system. The apparent magnitude depends on three things: how much light the star actually emits (which in turn depends on its size and temperature), how far away it is from Earth, and how much of its light is absorbed on the way to us. Apparent magnitude is related to the observed energy flux from the star. Today, astronomers use an improved and refined version of Hipparchus’ apparent magnitude scale, measuring the brightness of stars by photographic and electronic methods. The symbol for apparent magnitude is “mv”.
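Because the magnitude scale is logarithmic, a difference of 5 magnitudes corresponds to a brightness factor of exactly 100. The short Python sketch below illustrates this relationship; the function name and the sample flux ratios are chosen here purely for illustration.

```python
import math

def magnitude_difference(flux_ratio):
    """Convert a brightness (flux) ratio into a magnitude difference.

    Uses the standard logarithmic relation of the magnitude scale:
    m1 - m2 = -2.5 * log10(F1 / F2).
    """
    return -2.5 * math.log10(flux_ratio)

# A star delivering 100 times more flux than another is 5 magnitudes brighter,
# i.e. its magnitude is 5 units *smaller*, since brighter means lower magnitude.
print(magnitude_difference(100))    # -5.0
print(magnitude_difference(2.512))  # about -1.0 (one magnitude step)
```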
Difference between Absolute and Apparent Magnitude
Basics
– Absolute magnitude is a measure of a star’s luminosity: how bright the star would be if viewed from a distance of 10 parsecs, or about 32.6 light years. In simple terms, it is the apparent magnitude the star would have at a distance of 10 parsecs. Apparent magnitude, on the other hand, is a measure of how bright the star appears when viewed from Earth. Absolute magnitude is related to the intrinsic luminosity of the star, whereas apparent magnitude is related to the observed energy flux from the star.
Measurement
– Absolute magnitude is the apparent magnitude a celestial object would have if it were viewed from a distance of 10 parsecs, or about 32.6 light years, with nothing in between to dim its light. It measures the brightness of a celestial object as observed from a fixed standard distance. Apparent magnitude, on the contrary, measures the brightness of the celestial object, such as a star, as observed from wherever the observer happens to be, which in practice means from Earth. Apparent magnitude is how bright a star appears to the naked eye or through a telescope; however, it does not account for the distance of the star from Earth.
Calculation
– To find the absolute magnitude of a star, you need to know its distance and its apparent magnitude. The magnitude–distance formula relates the apparent magnitude mv, the absolute magnitude Mv, and the distance d in parsecs:
mv – Mv = – 5 + 5 log10(d)
The quantity (mv – Mv) is called the distance modulus of the star. It indicates the amount by which distance has dimmed the starlight. If any two of the quantities are known, you can calculate the third using the above equation.
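As a worked example, here is a small Python sketch of the same relation. The helper function names are illustrative, and the figures for Sirius (apparent magnitude of roughly –1.46 at a distance of about 2.64 parsecs) are approximate values used only to demonstrate the calculation.

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in parsecs,
    using the rearranged magnitude-distance formula: Mv = mv - 5*log10(d) + 5."""
    return apparent_mag - 5 * math.log10(distance_pc) + 5

def distance_modulus(apparent_mag, absolute_mag):
    """Distance modulus (mv - Mv): how much distance has dimmed the starlight."""
    return apparent_mag - absolute_mag

# Approximate values for Sirius: mv ~ -1.46, d ~ 2.64 parsecs.
m_sirius = -1.46
d_sirius = 2.64
M_sirius = absolute_magnitude(m_sirius, d_sirius)

print(round(M_sirius, 2))                              # about 1.43
print(round(distance_modulus(m_sirius, M_sirius), 2))  # about -2.89
```

Note that the distance modulus comes out negative here because Sirius is closer than the 10-parsec standard distance, so it appears brighter than its absolute magnitude would suggest.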
Absolute vs. Apparent Magnitude: Comparison Chart
Summary of Absolute vs. Apparent Magnitude
Astronomers describe the brightness of stars in terms of the absolute and apparent magnitude scales. Apparent magnitude measures the brightness of a star as observed from Earth, whereas absolute magnitude measures the brightness the star would have if observed from a standard distance of 10 parsecs, or about 32.6 light years. When speaking about the brightness of a star, you must be careful to distinguish between its apparent brightness and its luminosity. Apparent magnitude is how bright a star appears to the naked eye or through a telescope; the absolute magnitude of the star, however, is not as easy to measure, since it requires knowing the star’s distance.