On a perfectly flat plane, the horizon appears to be at exactly eye level and is an infinite distance away.
On a perfect sphere, the horizon is a finite distance away, and that distance depends on the radius of the sphere and the height of the observer above its surface. To find the distance to the horizon, you can use some basic trigonometry and geometry.
Imagine a triangle OCH, for observer-center-horizon: O is at the observer's eye, C is at the center of the sphere, and H is the point on the sphere's surface that appears as the horizon.
It should be easy to see that the line of sight to the horizon (line OH) is tangent to the sphere: it just grazes the surface, since it can neither pass through the sphere nor point away from it. Line CH is a radius of the sphere, and a tangent line is always perpendicular to the radius drawn to the point of tangency. These two facts combined mean that angle H is a right angle, so triangle OCH is a right triangle.
If the radius of the sphere (CH) is r, and the height of the observer's eye above the surface is s, then the length of OC is equal to r + s.
Because angle H is the right angle, OC is the hypotenuse of the triangle. By the Pythagorean theorem, (OC)^2 = (OH)^2 + (CH)^2, so OH = sqrt( (OC)^2 - (CH)^2 )
OH = sqrt( (r+s)^2 - r^2 ) = sqrt( r^2 + 2rs + s^2 - r^2 ) = sqrt( 2rs + s^2 )
This gives the straight-line length of the line of sight from the observer's eye to the horizon.
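As a rough check, here is a small Python sketch of that line-of-sight formula; the Earth radius of 6,371,000 m and the 2 m eye height are just illustrative values, not part of the derivation:

import math

def line_of_sight_distance(r, s):
    # OH = sqrt((r + s)^2 - r^2), which simplifies to sqrt(2*r*s + s^2)
    return math.sqrt(2 * r * s + s * s)

# An eye 2 m above a sphere roughly the size of Earth
print(line_of_sight_distance(6371000, 2))  # about 5048 m, roughly 5 km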
To find the distance to the horizon following the curve of the sphere, you need trigonometry.
The law of sines states that
OH/sin(C) = CH/sin(O) = OC/sin(H)
We want to find angle C. In the law of sines, each side is paired with the angle opposite it: CH (length r) is opposite angle O, and OC (length r + s) is opposite angle H.
OC = r + s
H = 90
sin(90) = 1 {this makes the problem a lot easier}
OC/sin(90) = r + s = CH/sin(O)
CH = r
r + s = r / sin(O)
sin(O) = r / (r + s)
O = sin^-1(r / (r + s)) #inverse sine
The three angles of any triangle add up to 180, and H = 90, so C = 90 - O = 90 - sin^-1(r / (r + s)), which is the same angle as cos^-1(r / (r + s)).
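As a quick sanity check (using an illustrative Earth radius of 6,371,000 m and an eye height of 2 m, neither of which comes from the derivation above): sin^-1(6371000 / 6371002) is about 89.955, so C is about 0.045 degrees, a tiny central angle, as you would expect for an observer standing right on the surface.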
Now that we know angle C and that the circumference of the sphere is 2*r*pi, we can find the portion of the circumference between the point on the surface directly below the observer and the horizon.
Distance = 2*r*pi / 360 * C = 2*r*pi / 360 * cos^-1(r / (r+s)) {with the inverse cosine measured in degrees}
where r = radius of the Earth (or any sphere) and s = height of the observer's eye above the surface.
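To tie it all together, here is a minimal Python sketch of the final arc-distance formula; the function name and the example values (Earth radius 6,371,000 m, eye height 2 m) are illustrative assumptions only:

import math

def horizon_arc_distance(r, s):
    # Central angle C in degrees, from C = cos^-1(r / (r + s))
    c_deg = math.degrees(math.acos(r / (r + s)))
    # Fraction of the circumference 2*r*pi swept out by angle C
    return 2 * r * math.pi / 360 * c_deg

print(horizon_arc_distance(6371000, 2))  # about 5048 m along the surface

For an observer close to the surface, the arc distance and the straight line-of-sight distance come out nearly identical, which is why the two are often treated as interchangeable at eye-level heights.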