Apparent magnitude (m) is how bright a star appears to be, but this depends on how far away it is. The closer a star is to us, the brighter it appears.
The absolute magnitude (M) of a star is a measure of how luminous it actually is: how bright it would appear compared with every other star if they were all viewed from the same distance.
M is defined as how bright a star would appear if it were at a distance of 10 parsecs from Earth (so if a star really is 10 parsecs from Earth, then M and m are equal).
If we know a star's apparent magnitude (m) and its distance from us in parsecs (d), we can calculate its absolute magnitude (M) using this equation:
M = m + 5 - 5 log₁₀ d
I am not going to explain logarithms here. All you need to know is that there is a “log” button on your calculator which you can use to find log d.
e.g. for Sirius (m = -1.47, d = 2.64 parsecs): M = -1.47 + 5 - (5 x log 2.64) = 3.53 - 2.11 = 1.42 (near enough!)
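The calculation above can be sketched in a few lines of Python (the function name `absolute_magnitude` is just an illustrative choice):

```python
import math

def absolute_magnitude(m, d):
    """Absolute magnitude from apparent magnitude m and distance d in parsecs."""
    return m + 5 - 5 * math.log10(d)

# Sirius: m = -1.47, d = 2.64 parsecs
M = absolute_magnitude(-1.47, 2.64)
print(round(M, 2))  # 1.42
```

The `log` button on a calculator is log base 10, which is `math.log10` in Python (`math.log` on its own is the natural logarithm).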
In an exam, it will be even more straightforward than this. Any distance involved will be a multiple of 10.
If d = 100 then log₁₀ d = 2
If d = 1,000 then log₁₀ d = 3
If d = 10,000 then log₁₀ d = 4
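A quick check of those values: when d is a power of 10, log₁₀ d is just the exponent, so the equation simplifies to M = m + 5 - 5n where d = 10ⁿ.

```python
import math

# For distances that are powers of ten, log10(d) is just the exponent.
for d in (10, 100, 1_000, 10_000):
    print(d, math.log10(d))
```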
The Inverse Square Law
Imagine an object, such as a star, which emits light. As the light spreads out, it becomes less intense. One can see from the diagram below that if a certain amount of light travels twice as far, it spreads out over an area four times as big. This means it will be 1/4 of the intensity it was.
The intensity of the light is proportional to 1/r², where r is the distance from the source.
If there were two stars as luminous as each other, but one was twice as far away as the other, then the more distant one would appear 1/4 as bright.
If two stars appeared the same brightness, but one was twice as luminous as the other, then the more luminous one must be further away. From the inverse square law, doubling the luminosity while keeping the apparent brightness the same means r² must double, so the more luminous star is √2 x further away than the closer star.