The table in your link shows just how irrelevant refraction is to my question: even with rays parallel to the earth's surface (0 degrees elevation), the refraction is less than 0.5 degrees. My question asks how the sun can be seen to set if it is at least 3 degrees above the horizon. Also, of course, refraction would make us see the sun "set" even higher in the sky. According to the table, refraction would add about 15' to my minimum elevation, i.e. the setting sun never "sets" less than 3.25 degrees above the horizon.
The problem is that the website does not address the hundreds, if not thousands, of chaotic variables that exist in the atmosphere on any given day.
"The underlying problem is achieving a suitable level of accuracy given the complex nature of the Earth's atmosphere."
- 90% humidity will have a different refraction than 0% humidity.
- 80 degrees Fahrenheit will have a different refraction than 20 degrees Fahrenheit.
- 100 kPa atmospheric pressure will have a different refraction than 20 kPa.
- A 2.8 pollen index will have a different refraction than a 4.1 pollen index.
- 400 ppm CO2 will have a different refraction than 300 ppm CO2.
- The troposphere has a different refraction than the stratosphere.
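To put rough numbers on how much pressure and temperature alone matter, here is a minimal sketch using Bennett's empirical refraction formula with the standard pressure/temperature correction. The function name and the example conditions are my own illustration, not from the site under discussion:

```python
import math

def refraction_arcmin(h_deg, pressure_hpa=1010.0, temp_c=10.0):
    """Approximate atmospheric refraction in arcminutes for an apparent
    altitude h_deg (degrees), using Bennett's empirical formula scaled by
    the usual pressure/temperature correction factor."""
    # Bennett's formula: R = cot(h + 7.31 / (h + 4.4)) arcminutes
    r = 1.0 / math.tan(math.radians(h_deg + 7.31 / (h_deg + 4.4)))
    # Scale for non-standard conditions (standard: 1010 hPa, 10 C)
    return r * (pressure_hpa / 1010.0) * (283.0 / (273.0 + temp_c))

# At the horizon under standard conditions: roughly 34' (about 0.57 degrees)
print(round(refraction_arcmin(0.0), 1))
# Hot, low-pressure air refracts noticeably less at the same altitude:
print(round(refraction_arcmin(0.0, pressure_hpa=950.0, temp_c=35.0), 1))
```

Even this simple model shows refraction at the horizon swinging by several arcminutes as conditions change, and it ignores humidity, aerosols, and layering entirely.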
We have images of distant hills taken at the same time, on the same day, from the same location and altitude, with the same humidity and barometric pressure.
In one picture the hill is totally visible.
In another picture half of the hill has "set" beyond the horizon.
Whatever optical trickery is causing half of the hill to disappear beyond the horizon could also be happening to the sun in the flat earth models.
In the video below, notice how the opposite shore goes from obscured to visible. The distance between the camera and the opposite shore is far less than the distance between an observer on earth and the moon/sun in any flat earth model.