Aside from the technical requirements, do you know if either site restricts the use of the tiles in third-party applications? I really don't think the technical hurdles are that big of a challenge. It's just scripting/batch processing.
What is the feasibility of dynamically loading the imagery in the future, like Google Earth does?
As 100 Mb/s and even 1,000 Mb/s internet connections become more common, bandwidth isn't really a concern. Obviously there's some CPU time required for processing the images.
I would actually expect the biggest roadblock to be whatever EULA the FSET/USGS site has on how applications can interface with their website.
The rich-man solution would be to just download all the data and store it in your own AWS account.
How is this not automated?
Parsing the XML files should be easy, and getting the image resolution should be easy. Since the XML files have the same names as the images, referencing the correct image should be easy too.
If I were a developer I'd have already done it, but I suck at coding and I know it. However, I know enough to know that someone who codes daily could do this in 10 minutes.
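For what it's worth, the pairing-by-filename idea could look something like this minimal Python sketch. It assumes each tile image ships with a same-named XML sidecar and that the metadata exposes width/height elements; the element names and the .bmp extension are guesses for illustration, not FSET's actual schema:

```python
# Sketch: pair each tile image with its same-named XML sidecar and read
# the resolution out of the metadata. The "Width"/"Height" element names
# and the .bmp extension are assumptions -- substitute whatever the real
# FSET metadata files actually use.
import tempfile
import xml.etree.ElementTree as ET
from pathlib import Path

def index_tiles(folder):
    """Map each image's base name to the (width, height) in its XML twin."""
    tiles = {}
    for xml_file in Path(folder).glob("*.xml"):
        image = xml_file.with_suffix(".bmp")  # same name, image extension
        if not image.exists():
            continue  # orphan metadata file, skip it
        root = ET.parse(xml_file).getroot()
        width = int(root.findtext("Width"))
        height = int(root.findtext("Height"))
        tiles[xml_file.stem] = (width, height)
    return tiles

# Quick self-test with a fake tile pair:
with tempfile.TemporaryDirectory() as d:
    Path(d, "tile_001.bmp").write_bytes(b"")  # stand-in image file
    Path(d, "tile_001.xml").write_text(
        "<Tile><Width>4096</Width><Height>4096</Height></Tile>")
    print(index_tiles(d))  # -> {'tile_001': (4096, 4096)}
```

From there it's just a loop over the index to batch-process whatever tiles you want, which is exactly the kind of glue script someone who codes daily could knock out quickly.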