I still think it's a pointless experiment to determine the shape of the earth either way.
Agree with this. And the reason is that the horizon would dip on a flat earth too.
I agree that on a flat earth, the apparent horizon should be anything but a distinct line, because you're not really looking at the horizon (i.e. the edge); you're just looking out into so much air that you can't see anything farther, unless it's really bright like the sun and the moon, which you can then see 10,000 miles away. Which is odd, since you can see both the sun and the moon at the same distance even though one is about 400,000 times brighter than the other. On a day when the sun is visible for 12 hours, the moon should be visible for a much shorter period.
But I digress.
Anyway, I suppose we should mention the difference between the horizon and the apparent horizon.
The classical definition of horizon goes along the lines of "the line that divides all visible directions into two categories: those that intersect the Earth's surface, and those that do not..."
It's obviously a sphere-centric term.
That's because on a flat earth, we (usually) cannot see the real traditional horizon (i.e. the edge), since it may be thousands of miles away; all we see is a gradual gradient transition between the water's surface and the haze. There's no particular line that is the horizon, just a point beyond which we can't really make out the texture of the water, even though we're looking right at it.
And naturally, the distance (and hence dip) of this line varies drastically based on the clarity of the air that day.
The difficulty comes in when we look out to sea and we see a hard sharp horizon line.
Now as mentioned, on a flat earth the true physical horizon (i.e. the edge), if we could see it, would dip increasingly with observer elevation. But only by a very, very small amount, depending on how far you were from the edge.
If you were near the equator, about 6000 miles from the edge, even a rise to 100,000 feet is still under 20 miles of altitude, and against that 6000-mile distance to the edge, that works out to a dip of only about 0.2 degrees.
At airliner altitudes like 35,000 feet, it would be only about 0.065 degrees.
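Just to show my arithmetic (a rough sketch; the 6000-mile edge distance and the simple flat-plane geometry are my assumptions, not anyone's official model), the dip is just atan(altitude / distance to the edge):

```python
import math

EDGE_DISTANCE_MILES = 6000   # assumed distance from an equatorial observer to the "edge"
FEET_PER_MILE = 5280

def flat_earth_dip_degrees(altitude_ft, edge_distance_miles=EDGE_DISTANCE_MILES):
    """Dip of the flat-earth 'edge' below eye level: simple right-triangle geometry."""
    altitude_miles = altitude_ft / FEET_PER_MILE
    return math.degrees(math.atan(altitude_miles / edge_distance_miles))

for alt in (35_000, 100_000):
    print(f"{alt:>7} ft -> dip of {flat_earth_dip_degrees(alt):.3f} degrees")
```

That prints roughly 0.063 degrees at 35,000 ft and 0.18 degrees at 100,000 ft, which is where the numbers above come from.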
So the difficulty is that the horizon does show a hard, sharp line, even at distances of a few miles if you're down near sea level, and that it dips far more drastically with observer elevation than a flat earth allows.
In fact, if we make our water level 57.3 inches long, then one inch at the far end corresponds to one degree, so you can use an inch ruler to measure degrees above or below eye level. (At least for the first few degrees.)
For the metric folks, just make your water level 57.3cm long and use a cm ruler to measure degrees.
(Obviously, if you wanted it accurate for more than a few degrees from eye level, you'd want to use a curved ruler, bent to a radius of 57.3.)
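For anyone who wants to check the 57.3 trick, here's a quick sketch. The only thing special about 57.3 is that it's 180/pi, i.e. one unit of offset per radian, and the code just shows how far a straight ruler drifts from the true angle as you go past the first few degrees:

```python
import math

LEVEL_LENGTH_IN = 57.3   # water level length in inches (57.3 ~ 180/pi, i.e. inches per radian)

def ruler_reading_to_degrees(offset_in, level_length=LEVEL_LENGTH_IN):
    """Exact angle for a straight-ruler offset held at the far end of the level."""
    return math.degrees(math.atan(offset_in / level_length))

for offset in (1, 2, 5, 10):
    print(f"{offset:>2} in on the ruler = {ruler_reading_to_degrees(offset):.3f} degrees")
```

One inch reads as 1.000 degrees, two as 1.999, but ten inches is really only about 9.90 degrees, hence the "first few degrees" caveat.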
The horizon moves away very drastically with even small increases in elevation: it goes from around 3 miles standing on the beach, to 8 miles at 50 ft, to 12 miles at 100 ft, to 39 miles at 1000 ft of elevation.
And all on the same day. How can going from an eye level of 6 ft to an eye level of 50 ft more than double the distance to the horizon?
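Those distances are just the standard no-refraction horizon approximation for a sphere, d ≈ sqrt(2Rh). Here's a quick sketch to reproduce them (the 3959-mile radius is my assumed value for the calculation):

```python
import math

EARTH_RADIUS_MILES = 3959   # assumed mean radius for the calculation
FEET_PER_MILE = 5280

def horizon_distance_miles(eye_height_ft, radius=EARTH_RADIUS_MILES):
    """Approximate distance to the geometric horizon on a sphere, ignoring refraction."""
    return math.sqrt(2 * radius * eye_height_ft / FEET_PER_MILE)

for h in (6, 50, 100, 1000):
    print(f"eye level {h:>4} ft -> horizon roughly {horizon_distance_miles(h):.1f} miles away")
```

That gives about 3.0, 8.7, 12.2, and 38.7 miles, matching the figures above.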
Furthermore, things are often visible beyond the horizon if they stick up enough. If the atmosphere only lets me see the water out to a few miles when I'm standing on the beach, then how is it possible for me to literally be seeing mountains 100 miles beyond that?
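To put a rough number on "stick up enough", here's a sketch of the usual hidden-height geometry on a sphere (again ignoring refraction; the 6 ft eye height and 100-mile mountain distance are just example values I picked):

```python
import math

EARTH_RADIUS_MILES = 3959
FEET_PER_MILE = 5280

def hidden_height_ft(target_distance_miles, eye_height_ft, radius=EARTH_RADIUS_MILES):
    """Feet of a distant object hidden below the geometric horizon (no refraction)."""
    d_horizon = math.sqrt(2 * radius * eye_height_ft / FEET_PER_MILE)
    beyond = max(target_distance_miles - d_horizon, 0.0)
    return beyond ** 2 / (2 * radius) * FEET_PER_MILE

# Example: a mountain 100 miles away, viewed from 6 ft of eye height on the beach.
print(f"hidden: roughly {hidden_height_ft(100, 6):,.0f} ft")
```

That comes out to roughly 6,300 ft hidden, so a tall enough mountain still shows its top above the horizon, which is exactly what the globe geometry predicts, and exactly what "the air just gets too thick" can't explain.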