Yes, there are multiple reasons why they can appear to touch. Insufficient resolving power of the eye or camera (as has been stated multiple times) is one of them. Decreased clarity due to atmospheric scattering is another. But none of that is a flaw in the geometry itself. The basic geometry describes a perfect world: there is no such thing as perfectly parallel lines or a perfectly clear atmosphere. But that doesn't mean the entire theory is useless.
"If their theory doesn't match observations it means the theory is wrong and must be modified or discarded."
Not necessarily. It might mean your basic assumptions are wrong. For example, the "ancient greek math" assumes that you have perfect resolving power and no atmospheric effects. Clearly these are bad assumptions. So what should we do? Should we just give up on basic geometry and declare that math is useless? Of course not, that would be silly.
Instead, let's come up with some more theories that describe how these "bad assumptions" affect the result from the original theory.
Theory A:
Anything smaller than about one arcminute in diameter (0.017 degrees) is not distinguishable by the human eye.
Theory B:
Stuff becomes blurry when viewed from really far away through an atmosphere.
Now let's combine the "ancient greek math" (which describes a perfect world) with Theory A (which takes into account one of those imperfections).
Train Track Example:
You are looking down some train tracks that extend a long way into the distance. They are separated by a distance of 4 feet. Let's calculate how far apart those train tracks should appear 3 miles (15840 feet) away using the "ancient greek math".
angle = tan⁻¹(4/15840) ≈ 0.014 degrees
So, according to the "ancient greek math", the tracks should appear to be separated by 0.014 degrees. However, according to "Theory A", the human eye can't distinguish anything less than 0.017 degrees. Therefore, we should expect the tracks to appear to touch to the human eye, even though the "ancient greek math" says they technically aren't touching.
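For anyone who wants to check the arithmetic, here is a quick Python sketch of that comparison. The numbers are just the ones from the example above, and the 1-arcminute limit is the assumption from Theory A:

```python
import math

# Numbers from the train track example above.
track_separation_ft = 4.0
distance_ft = 3 * 5280           # 3 miles = 15840 feet
eye_limit_deg = 1.0 / 60         # Theory A: ~1 arcminute, about 0.017 degrees

# "Ancient greek math": apparent angular separation of the rails.
apparent_deg = math.degrees(math.atan(track_separation_ft / distance_ft))

print(f"apparent separation:  {apparent_deg:.3f} degrees")   # ~0.014
print(f"eye resolution limit: {eye_limit_deg:.3f} degrees")  # ~0.017
if apparent_deg < eye_limit_deg:
    print("-> the tracks should appear to touch")
else:
    print("-> the tracks should appear separated")
```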
The "ancient greek math" doesn't tell the entire story. It doesn't bother to take into account the limitations of the human eye, or cameras, or the atmosphere. But that doesn't mean it is useless. It is a starting point.
If you want to show that the "ancient greek math" is wrong, then you need to show an error in their proof. You have not even attempted that.
However, if you want to show that the "ancient greek math" is not applicable to the real world, then you need to show us a contradiction between the mathematical result and reality that can't be explained by some known phenomenon. So far, you have only repeatedly stated "the math says they should never touch, but they do touch". This contradiction can easily be explained by the limitations of the human eye or camera (as shown in my example above).
Sun Example:
On the other hand, the "ancient greek math" states that the sun should be separated from the horizon by about 20 degrees at a minimum, assuming the sun is 3000 miles high and can be at most about 8000 miles away horizontally (the Earth's equatorial diameter).
angle = tan⁻¹(3000/8000) ≈ 20.6 degrees
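The same check in Python, using the flat-earth figures assumed above (these numbers come from the example, not from me):

```python
import math

# Assumed flat-earth figures: sun 3000 miles high, never more than
# ~8000 miles away horizontally (Earth's equatorial diameter).
sun_height_mi = 3000.0
max_horizontal_mi = 8000.0

min_elevation_deg = math.degrees(math.atan(sun_height_mi / max_horizontal_mi))
print(f"minimum sun elevation: {min_elevation_deg:.1f} degrees")  # ~20.6
```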
Is there any phenomenon that you know of that can make 20 degrees appear to be 0 degrees? I don't know of one. Most cameras can easily distinguish angles much smaller than 20 degrees, assuming they have more than a few pixels. We can't blame atmospheric scattering either, because we can still distinguish the actual sun, which spans much less than 20 degrees.
Unless you can come up with an explanation for this difference, blaming the "ancient greek math" just seems like a refusal to accept that your model might be wrong.
Sorry for the late reply. Busy busy.