I believe it is approximately 5,000 miles away from my location during the summer months, at around 10:00 am.
If that is true, how can you see it?
The ability of humans to perceive objects is roughly limited to 350 km, regardless of the shape of the earth.
Are you arguing with yourself?
Actually no.
The visual acuity figure I presented is relative to objects not emitting their own light. I should have clarified that statement when I wrote it, so thanks for bringing it up.
If the figure is relative to objects not emitting their own light, then why is it relevant, seeing as Polaris does in fact emit its own light?
Polaris has been observed to be consistently directly overhead at the North Pole. From there, its elevation drops by one degree for each degree of latitude traveled south, until you reach the equator, where it sits on the horizon. This phenomenon has been used for celestial navigation for centuries: sailors would use an instrument called a sextant to measure the angle between Polaris and the horizon, which gave them their latitude directly. However, this falls apart under any flat earth model.

In the unipolar model, latitude lines are a regular distance apart, so let us use this to construct a mathematical model. If a star sits at any height above the pole and a viewer stands directly beneath it, the star will always appear at an angle of 90 degrees to the horizon. If the viewer is 1 unit away, where 1 unit represents 1 degree of latitude, then the height of the star does matter, so let us set it at 1 unit as well. If this is so, the viewer will perceive the star at 45 degrees above the horizon, rather than the expected 89. So, the height of the star must be raised. In fact, it must be raised to approximately 57.28996 units, since tan(89°) ≈ 57.28996. With one degree of latitude spanning about 69 miles, this works out to 3953.01 miles up. That's high, but nowhere near the RE number.

However, the numbers diverge as one travels south. Each degree the viewer moves away results in a drop in observed angle that is slightly less than 1 degree, and this disparity gets worse the further the viewer travels. At 45 units, the viewer would see an elevation of 51.8 degrees, which is 6.8 degrees more than expected. At 60 units, the viewer sees an elevation of 43.65 degrees, which is 13.65 degrees more than expected. At 70 units, the viewer sees an elevation of 39.27 degrees; at 80, 35.59; and by the time the viewer reaches 90 units and should see an elevation of 0 degrees, or in other words a star on the horizon, the viewer instead sees an elevation of 32.46 degrees. Obviously, something is going on.
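For anyone who wants to check the flat-earth numbers above, here is a minimal Python sketch. The function name and the 1-unit-per-degree-of-latitude convention are my own, following the setup in the post:

```python
import math

# Flat-earth (unipolar) model: Polaris sits at height H directly above the
# North Pole, and the viewer stands d "latitude units" away on a flat plane
# (1 unit = 1 degree of latitude, about 69 miles).
# H is calibrated so that a viewer 1 unit away sees the star at 89 degrees:
# tan(89 deg) = H / 1, so H is about 57.28996 units (about 3953 miles).
H = math.tan(math.radians(89))

def flat_elevation(d):
    """Elevation angle (degrees) of the star for a viewer d units from the pole."""
    return math.degrees(math.atan2(H, d))

for d in (45, 60, 70, 80, 90):
    expected = 90 - d  # what sailors actually measure with a sextant
    print(f"{d:2d} units out: model predicts {flat_elevation(d):6.2f} deg, "
          f"reality is {expected} deg")
```

The predicted elevations fall toward the horizon far too slowly: at the equator the flat model still puts Polaris more than 32 degrees up, when it should be sitting on the horizon.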
This something is the sphericity of the earth.
If a similar scenario is constructed with the viewer on the surface of a sphere instead, an interesting phenomenon occurs. As the viewer moves around the curve of the sphere, changing latitude, the angle to the star, positioned above the North Pole, can again be found. If this angle is measured relative to the viewer's horizon, that is, the tangent at the viewer's position, then it comes out far below the expected angle if the star is only 1 unit above the North Pole. However, as the star is moved further and further from the pole, the observed angle does not overshoot the expected angle. Rather, it approaches it. If a viewer is at 80 degrees latitude and the star is 10,000 units away, the observed angle is 79.899 degrees. If the star is 100,000 units away, the observed angle is 79.990. With the RE figure of Polaris being 433 light-years away, which works out to 3.688E13 units, the difference between observed and expected angles becomes vanishingly small.
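The sphere case can be sketched the same way. This is my own reconstruction of the geometry: I assume the star sits on the polar axis at a given height above the pole, and take the sphere's radius to be 360/(2π) latitude units so that one unit of arc equals one degree. Exact decimals will shift slightly with those assumptions, but the key behavior, the observed angle climbing toward the viewer's latitude from below as the star recedes, does not:

```python
import math

R = 360 / (2 * math.pi)  # sphere radius in latitude units (circumference = 360)

def sphere_elevation(lat_deg, star_dist):
    """Elevation (degrees) of a star on the polar axis, star_dist units above
    the pole, seen by a viewer at latitude lat_deg, measured against the
    local horizon (the tangent plane at the viewer's position)."""
    c = math.radians(90 - lat_deg)          # colatitude of the viewer
    vx = -R * math.sin(c)                   # viewer-to-star vector components
    vy = (R + star_dist) - R * math.cos(c)
    up = vx * math.sin(c) + vy * math.cos(c)  # component along local "up"
    return math.degrees(math.asin(up / math.hypot(vx, vy)))

for d in (10_000, 100_000, 3.688e13):       # 3.688e13 units is roughly 433 ly
    print(f"star {d:g} units up: viewer at 80 deg N sees "
          f"{sphere_elevation(80, d):.5f} deg")
```

On the sphere, the observed angle always sits just below the viewer's latitude and converges to it as the star's distance grows, which is exactly what sextant navigation relies on.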
The results speak for themselves. One model, the flat earth, fails this test, while the round earth passes it. And this is without any fancy math: anyone with a basic understanding of trigonometry can crunch these numbers. What does Occam's Razor say?