The magnitude scale is used by astronomers to compare the brightnesses of different stars, or more precisely their apparent brightness as seen from Earth. Originally, over 2,000 years ago, stars were assigned to one of six classes according to their brightness as seen with the naked eye. The brightest stars were given magnitude 1, while the faintest visible stars were given magnitude 6.
With modern technology, we can detect stars much fainter than magnitude 6 and measure the differences in brightness between stars accurately. A magnitude 1 star is 100 times brighter than a magnitude 6 star, so each difference of 1 in magnitude corresponds to a factor of 2.512 (the fifth root of 100) in relative apparent brightness. The magnitude scale is therefore logarithmic. The magnitudes of some astronomical objects are shown below.
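Written as a formula (a standard way of expressing the relation described above; the symbols m_1, m_2 for the apparent magnitudes and b_1, b_2 for the apparent brightnesses are introduced here, not in the original text):

```latex
% Brightness ratio corresponding to a difference in apparent magnitude.
% m_1, m_2: apparent magnitudes;  b_1, b_2: apparent brightnesses.
\[
  \frac{b_1}{b_2} = 100^{(m_2 - m_1)/5} \approx 2.512^{\,m_2 - m_1},
  \qquad
  m_2 - m_1 = 2.5 \log_{10}\!\left(\frac{b_1}{b_2}\right)
\]
```

A difference of 5 magnitudes then gives exactly a factor of 100, consistent with the magnitude 1 versus magnitude 6 comparison above.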
Example: Betelgeuse and Sirius have apparent magnitudes of 0.50 and -1.46 respectively.
The difference in their apparent magnitudes is 0.50 - (-1.46) = 1.96.
Hence Sirius appears 2.512^1.96 ≈ 6.1 times brighter than Betelgeuse.
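The same calculation can be checked numerically. The short Python sketch below is illustrative only; the function name brightness_ratio is invented for this example, but the magnitudes are the ones used in the worked example above.

```python
# Brightness ratio corresponding to a given difference in apparent magnitude.
# A difference of 5 magnitudes corresponds to a factor of exactly 100,
# so 1 magnitude corresponds to a factor of 100**(1/5), about 2.512.

def brightness_ratio(m_fainter: float, m_brighter: float) -> float:
    """Return how many times brighter the object with magnitude m_brighter
    appears than the object with magnitude m_fainter."""
    return 100 ** ((m_fainter - m_brighter) / 5)

# Worked example: Betelgeuse (m = 0.50) and Sirius (m = -1.46).
ratio = brightness_ratio(0.50, -1.46)
print(f"Sirius appears {ratio:.1f} times brighter than Betelgeuse")  # ~6.1
```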