If footage of the Earth from space is faked, it is only sensible to assume that there is one centralised source for this footage - otherwise the inconsistencies you were looking for would crop up early on.
That is, IF the footage is fake. But, there are no inconsistencies, and the footage matches the data. Unless you're telling me now that the data of the clouds is also faked.
Whether this is achieved via a pack of textures and shaders, or a simple seed for procedural generation, it is not at all difficult to arrange for consistent results across sources.
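The consistency-from-a-seed point can be illustrated with a toy sketch. This is a hypothetical illustration only: real fake-cloud rendering would use textures or noise functions, but the principle is the same, identical seed in, identical output out, no matter who runs it.

```python
import random

def cloud_field(seed, width=8, height=8):
    """Generate a toy 'cloud cover' grid procedurally from a seed.

    Hypothetical illustration: any deterministic generator driven by
    the same seed will produce the same grid for every consumer.
    """
    rng = random.Random(seed)  # each source builds its own generator
    return [[rng.random() for _ in range(width)] for _ in range(height)]

# Two independent "sources" sharing only the seed agree exactly:
source_a = cloud_field(seed=42)
source_b = cloud_field(seed=42)
assert source_a == source_b
```

So cross-source agreement alone cannot distinguish real shared weather from a shared generator; that is why the argument has to move on to timing.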
But it is difficult to do in real time. The footage of the car in space is real-time, while the meteorological data only becomes available periodically; we do not have a real-time source for the cloud data. That is the difficult, arguably impossible, part: predicting where the clouds will be, before the next data set becomes available, to such a degree of accuracy is near impossible.
The only thing you can honestly conclude is that the two sources you compared were (arguably) consistent with one another.
And if one source is simply a meteorological station, then the data could be real. If you compare the footage of the car to the real data and see that they match exactly, that is a strong indication that the footage of the car is real too. If the clouds move in real time, as they did in the footage, that is further evidence it is real, since there is no way to perfectly reproduce the clouds' shape in real time if it were faked using equally fake data.
The only way for this to happen is if the meteorological data was also fake and being generated by, like you said, a seed for procedural generation.