Humans have been trying to do this for centuries. An Ancient Greek astronomer named Hipparchus invented a scale to measure the brightness of stars. It is called the magnitude scale. He gave the brightest stars a value of 1 and the dimmest stars he could see a value of 6.
The magnitude unit of measurement is unusual: the more negative the number, the brighter the object. So the greater a star's magnitude, the dimmer it appears in the sky!
Also, the scale is not linear. For example, a magnitude 2 star is not twice as dim as a magnitude 1 star. Each step of one magnitude corresponds to a brightness ratio of almost 2.5 times (about 2.512, the fifth root of 100). This means that a 6th magnitude star is 100 times fainter than a 1st magnitude star (2.512 x 2.512 x 2.512 x 2.512 x 2.512 = 100).
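The arithmetic above can be sketched in a few lines of Python (a minimal illustration; the function name `brightness_ratio` is just for this example):

```python
def brightness_ratio(m_faint, m_bright):
    """How many times fainter the first object is than the second.

    A difference of 5 magnitudes is defined as a factor of exactly 100,
    so one magnitude corresponds to a factor of 100 ** (1/5), about 2.512.
    """
    return 100 ** ((m_faint - m_bright) / 5)

# A 6th-magnitude star compared with a 1st-magnitude star:
print(brightness_ratio(6, 1))  # 100.0 (2.512 multiplied together five times)

# One magnitude apart:
print(brightness_ratio(2, 1))  # roughly 2.512
```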
The human eye can detect a difference of one magnitude quite easily; smaller differences are much harder to see and take a lot of practice. The night sky today is much harder to observe using only our eyes: dim stars are often hidden by light pollution, so going somewhere far from towns and cities allows us to see many more stars.
Since Hipparchus created his scale the telescope has been invented. This has allowed us to see much fainter stars, so the magnitude scale has been extended to record them with magnitudes greater than 6. We also now include very bright objects like the Sun and Moon, which have magnitudes less than 1 (the very brightest objects have negative magnitudes).
The table below helps explain the magnitude scale.
| Object | Magnitude | Notes |
|---|---|---|
| Sirius (brightest star in the sky) | -1.5 | |
| Betelgeuse (star in the constellation of Orion) | 0.5 | Hipparchus's scale (1 to 6) |
| Regulus (star in the constellation of Leo) | 1.3 | Hipparchus's scale (1 to 6) |
| Dimmest star seen with the naked eye | 6 | Hipparchus's scale (1 to 6) |
| Pluto (dwarf planet) | 14 | Objects only visible with a telescope |
| Dimmest object observable with the Liverpool Telescope | 25 | Objects only visible with a telescope |
| Hubble Telescope - Deep Field Observation | 30 | Objects only visible with a telescope |
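The table values give a sense of how far the scale now reaches. As a rough worked example in Python (a sketch using the table's magnitudes of 6 for the naked-eye limit and 30 for the Hubble Deep Field):

```python
# How much fainter is the Hubble Deep Field limit (magnitude 30)
# than the dimmest star the naked eye can see (magnitude 6)?
delta_m = 30 - 6
ratio = 100 ** (delta_m / 5)  # every 5 magnitudes is a factor of 100
print(f"{ratio:.3g}")  # about 4 billion times fainter
```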