
The magnitude scale
Hipparchus
Magnitude is a historic concept. Hipparchus (c. 190–120 BCE) categorized stars from the 1st magnitude (the typical brightest stars) through to the 6th magnitude (the faintest visible to the eye). This describes the brightness of a star without any reference to how far away it is and is denoted with a lowercase m.

Hipparchus thus classified stars by their relative brightness and so began the magnitude scale. The classification was based purely on visual observation, because in those days flux or brightness could not be measured with photometers as it is today.

Hipparchus defined a magnitude +2 star as half as bright as a magnitude +1 star. The total range of his scale ran from +1 (the brightest stars) to +6 (the dimmest visible to the naked eye). In effect this is a logarithmic scale.

Based on measurements in the 19th century, the magnitude scale was recalibrated so that a first (+1) magnitude star is exactly 100 times brighter, in terms of measured flux, than a sixth (+6) magnitude star. An interval of 5 magnitudes therefore corresponds to a factor of 100 in flux.

This means that each step of one magnitude corresponds to a factor of 2.512 in brightness, because

2.512 × 2.512 × 2.512 × 2.512 × 2.512 = (2.512)^5 ≈ 100

The factor 2.512 is simply the fifth root of 100, i.e. 100^(1/5) ≈ 2.512.

Pogson expressed this in 1856 as the “fifth root of hundred” ratio, now known as Pogson’s ratio.
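
The relation between a magnitude difference and a brightness (flux) ratio is easy to write down in code. The short Python sketch below is a minimal illustration of the Pogson relation described above; the helper name flux_ratio is just an illustrative choice, not something defined in this text.

```python
# Pogson's relation: 5 magnitudes correspond to a flux ratio of exactly 100,
# so one magnitude corresponds to a factor of 100**(1/5) ≈ 2.512.

def flux_ratio(m_faint, m_bright):
    """How many times brighter the object with magnitude m_bright is
    than the object with magnitude m_faint."""
    return 100 ** ((m_faint - m_bright) / 5)

print(100 ** (1 / 5))    # ≈ 2.512, Pogson's ratio
print(flux_ratio(6, 1))  # 100.0: a +1 star is 100 times brighter than a +6 star
print(flux_ratio(2, 1))  # ≈ 2.512: a single magnitude step
```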

The observed magnitude has since been referred to as the Apparent Magnitude (symbol m) and is still in wide use today.
In order to have a similar measure for a star's luminosity, the Absolute Magnitude (symbol M) is defined as the apparent magnitude that a star would have at a standard distance of 10 parsec (pc).

As an example, the Sun has an apparent magnitude m = −26.73 but an absolute magnitude of +4.75. That is the apparent magnitude we would see if the Sun were at a distance of 10 pc.
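
The conversion between the two is not spelled out in this text, but it follows from the inverse-square law as the standard distance-modulus relation M = m − 5 log10(d / 10 pc). The sketch below assumes that relation and a Sun–Earth distance of 1 AU ≈ 4.848 × 10⁻⁶ pc; the function name absolute_magnitude is an illustrative choice.

```python
import math

def absolute_magnitude(m, distance_pc):
    """Rescale an apparent magnitude m, observed from a distance of
    distance_pc parsec, to the standard distance of 10 pc."""
    return m - 5 * math.log10(distance_pc / 10)

AU_IN_PC = 4.848e-6  # 1 astronomical unit expressed in parsec
print(round(absolute_magnitude(-26.73, AU_IN_PC), 2))
# prints ≈ 4.84; the quoted +4.75 corresponds to slightly different input values
```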


Negative magnitude
With refined methods of measuring brightness, it has been found that some stars are actually brighter than magnitude +1 stars. These are given a negative number for their magnitude.

Sirius, the brightest star, has an apparent magnitude of −1.44. The scale can also be used for other celestial objects, both brighter than the brightest stars and fainter than the stars we can see with the naked eye.

Examples of objects with various apparent magnitudes

     m      Object
  −26.73    Sun
  −12.6     Full Moon
   −4.4     Maximum brightness of Venus
   −4.0     Faintest objects observable during the day with the naked eye
   −2.8     Maximum brightness of Mars
   −1.5     Brightest star at visible wavelengths: Sirius
   −0.7     Second brightest star: Canopus
    0       The zero point by definition: close to Vega’s apparent magnitude
   ~3       Faintest stars visible in an urban neighbourhood
   ~6       Faintest stars observable with the naked eye
   12.6     Brightest quasar
   27       Faintest objects observable in visible light with 8 m ground-based telescopes
   30       Faintest objects observable in visible light with the Hubble Space Telescope

Note that the star Vega is the reference star for the magnitude scale (m = 0).
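
To get a feel for the range the table covers, the same Pogson relation converts any two of its entries into a flux ratio. A small sketch (the helper flux_ratio is again an illustrative name, as in the earlier example):

```python
def flux_ratio(m_faint, m_bright):
    # Pogson relation: 5 magnitudes correspond to a factor of 100 in flux
    return 100 ** ((m_faint - m_bright) / 5)

# Sun (−26.73) versus the faintest naked-eye stars (~6)
print(f"{flux_ratio(6, -26.73):.1e}")   # ≈ 1.2e+13
# Sun versus the faintest objects reachable by Hubble (30)
print(f"{flux_ratio(30, -26.73):.1e}")  # ≈ 4.9e+22
```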


Summary: apparent magnitude

Apparent magnitude (symbol m):

  • indicates the brightness of a star as we see it
  • is a scale that runs opposite to brightness (a larger magnitude means a fainter object)
  • has its zero point at the apparent magnitude of Vega (m = 0)
  • allows negative values for bright stars
  • changes by one unit for every factor of 2.512 in brightness, or by 5 units for every factor of 100.

Example

Beta Centauri has an apparent magnitude of m = 0.61,
Gamma Crucis has an apparent magnitude of m = 1.64.

Which of the two is brighter and by how much?

The magnitude difference is 1.64 − 0.61 = 1.03.
This corresponds to a brightness ratio of (2.512)^1.03 ≈ 2.58.
Therefore Beta Centauri is about 2.58 times brighter than Gamma Crucis.
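
As a check, the same calculation takes a couple of lines of Python, using the Pogson relation from the earlier sketch:

```python
m_beta_cen = 0.61    # Beta Centauri
m_gamma_cru = 1.64   # Gamma Crucis

# Brightness ratio from the magnitude difference of 1.03
ratio = 100 ** ((m_gamma_cru - m_beta_cen) / 5)  # equivalent to 2.512**1.03
print(round(ratio, 2))                           # ≈ 2.58
```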