
Trigonometry/The distance and dip of the horizon


Since the Earth is (approximately) a sphere, an observer close to its surface can see only a limited area, bounded by a circle centred on the observer. This circle is called the horizon. Furthermore, every point on this circle lies slightly below the plane through the observer that is perpendicular to the line from the observer to the centre of the Earth. The angle between this plane and the line from the observer to a point on the horizon is called the dip of the horizon.

[Insert diagram]

Let the radius of the Earth be $r$, the distance to the horizon be $d$ and the height of the observer above the Earth be $h$. The line of sight from the observer to the horizon is tangent to the Earth, so it meets the radius at the horizon point at a right angle. Then by Pythagoras' theorem

$$(r+h)^2 = r^2 + d^2,$$

or, rearranging,

$$d = \sqrt{2rh + h^2}.$$

In normal circumstances, $h$ will be so much less than $r$ that the $h^2$ term is negligible, and it is accurate enough to write

$$d \approx \sqrt{2rh}.$$
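As a quick sanity check on these formulae, here is a minimal Python sketch (the function and variable names are illustrative, not part of the text) computing both the exact and the approximate distance:

```python
import math

def horizon_distance(r, h):
    """Distance to the horizon for an observer at height h on a sphere of radius r.

    Returns (exact, approximate) distances in the same units as r and h.
    """
    exact = math.sqrt(2 * r * h + h * h)   # d = sqrt(2rh + h^2)
    approx = math.sqrt(2 * r * h)          # d ≈ sqrt(2rh), valid when h << r
    return exact, approx

# Earth radius 3960 miles, observer 100 feet (= 100/5280 miles) up
exact, approx = horizon_distance(3960, 100 / 5280)
print(exact, approx)  # both print as roughly 12.25 miles
```

For everyday heights the two results agree to many decimal places, which is why the approximation is normally used.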

This formula may be inverted to find how high you must be to see to a given distance:

$$h \approx \frac{d^2}{2r}.$$
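For instance (an illustrative figure, not taken from the text), to see to a distance of 40 miles with $r = 3960$ miles requires a height of about

$$h \approx \frac{40^2}{2 \times 3960} = \frac{1600}{7920} \approx 0.202 \text{ miles} \approx 1067 \text{ feet}.$$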

The dip $\theta$ is equal to the angle at the centre of the Earth between the observer and the horizon, so

$$\tan\theta = \frac{d}{r}.$$

It will be proved in Book 3 that for small angles $\tan\theta \approx \theta$ when $\theta$ is measured in radians, so

$$\theta \approx \frac{d}{r}.$$
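Combining this with the earlier approximation $d \approx \sqrt{2rh}$ expresses the dip directly in terms of the observer's height:

$$\theta \approx \frac{\sqrt{2rh}}{r} = \sqrt{\frac{2h}{r}} \text{ radians}.$$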
Worked Example

Suppose the radius of the Earth is 3960 miles and the observer is 100 feet high. Converting the height to miles, $h = \tfrac{100}{5280}$ miles, so

$$d \approx \sqrt{2 \times 3960 \times \tfrac{100}{5280}} = \sqrt{150} \approx 12.25 \text{ miles}$$

$$\theta \approx \frac{12.25}{3960} \approx 0.0031 \text{ radians} \approx 10.6 \text{ minutes of arc}.$$
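The arithmetic can be verified with a few more lines of Python (again a sketch, with illustrative variable names):

```python
import math

r = 3960          # radius of the Earth in miles
h = 100 / 5280    # observer's height: 100 feet converted to miles

d = math.sqrt(2 * r * h)                 # ≈ 12.25 miles to the horizon
dip = d / r                              # dip ≈ 0.0031 radians
dip_arcmin = math.degrees(dip) * 60      # dip ≈ 10.6 minutes of arc

print(d, dip, dip_arcmin)
```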

Note that in practice light rays are bent slightly by the Earth's atmosphere, so in general the horizon is slightly farther, and the dip slightly less, than these simple rules predict.