An ancient Greek astronomer named Hipparchus invented a magnitude scale to measure the brightness of stars. He gave the brightest stars a value of 1 and the dimmest stars he could see a value of 6.
Watch out though, because the magnitude unit of measurement is rather unusual.
- First of all, the greater a star's magnitude, the dimmer it appears in the sky! For example, a 5th magnitude star is brighter than a 6th magnitude star.
- Secondly, each step of one magnitude corresponds to a brightness factor of about 2.5 (2.512, to be precise). This means that a 6th magnitude star is 100 times fainter than a 1st magnitude star, since the five steps multiply together (2.512 × 2.512 × 2.512 × 2.512 × 2.512 ≈ 100).
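The arithmetic above can be sketched in a few lines of Python. The function name here is just for illustration; the rule it encodes is the one described above: a difference of one magnitude is a brightness factor of 100 to the power 1/5, so a difference of five magnitudes is a factor of 100.

```python
def brightness_ratio(magnitude_difference):
    """How many times brighter the smaller-magnitude object is.

    One magnitude step = a factor of 100 ** (1/5), which is about 2.512,
    so a difference of dm magnitudes = a factor of 100 ** (dm / 5).
    """
    return 100 ** (magnitude_difference / 5)

# One magnitude step is about 2.512 times in brightness:
print(brightness_ratio(1))

# A 1st magnitude star is ~100 times brighter than a 6th magnitude star:
print(brightness_ratio(6 - 1))
```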
The eye can detect a difference of one magnitude quite easily; smaller differences require a trained eye. These days, however, it is difficult to see 6th or even 5th magnitude stars because of light pollution. You need to go into the countryside, where there is much less artificial light.
Since Hipparchus created his scale, it has been extended in both directions: to record much fainter stars, with magnitudes greater than 6, that can only be observed with telescopes, and to include very bright objects such as the Sun and Moon, which have magnitudes less than 1. Note that the very brightest objects have negative (minus) magnitudes.
The table below helps explain the magnitude scale.
| Object | Magnitude |
| --- | --- |
| Sirius (brightest star in sky) | -1.5 |
| Betelgeuse (star in constellation of Orion) | 0.5 |
| Regulus (star in constellation of Leo) | 1.3 |
| Dimmest star seen with naked eye | 6 |
| Dimmest object observable with the Liverpool Telescope | 25 |
| Hubble Telescope - Deep Field Observation | 30 |

Hipparchus's original scale covered only the range 1 to 6.
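The magnitude rule lets us compare entries in the table directly. As a rough sketch (using the table's magnitudes and the factor-of-100-per-5-magnitudes rule from earlier):

```python
# Magnitudes taken from the table above.
naked_eye_limit = 6    # dimmest star visible to the naked eye
hubble_limit = 30      # Hubble Deep Field limit

# A difference of dm magnitudes is a brightness factor of 100 ** (dm / 5).
# 30 - 6 = 24 magnitudes, i.e. 100 ** 4.8, roughly four billion times fainter.
ratio = 100 ** ((hubble_limit - naked_eye_limit) / 5)
print(f"Hubble can detect objects ~{ratio:.2g} times fainter than the naked eye")
```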