I have a list of URLs (pretty much all image URLs, though some PDFs) that I need to download. I have found a variety of options for bulk downloading, and these would work, but I need the files organized by the directory they are listed under in the URL. For example:
http://ift.tt/1wru0D5
http://ift.tt/1BHXjbY
http://ift.tt/1wru0Tk
http://ift.tt/1wru0To
http://ift.tt/1BHXjc2
http://ift.tt/1wru2dT
I would need these to be organized like this:
Folder Sample1
image1.jpg
image2.jpg
image3.jpg
Folder Sample2
image1.jpg
image2.jpg
image3.jpg
I do have SFTP access, but each directory is terribly organized, with image files mixed in with other irrelevant files. Additionally, most of the batch scripts I have tried to create have had issues: with xcopy there was no way to figure out which files failed, and with robocopy speed was compromised. Any suggestions on how I should go about moving forward? Existing software is preferred, but I am also fine with advice on how to script this. I would prefer not to have to install anything to access SFTP via the command line, but if that's the only option, it is what it is.
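In case the scripting route is acceptable, here is a minimal sketch of one way to do it, assuming Python 3 is available and that the real (unshortened) URLs expose the source directory in their path; the file names urls.txt, downloads, and failed.txt are placeholders, not anything from the original list. The shortened ift.tt links above would all fall into a single "misc" folder, since their paths carry no directory information.

import os
import posixpath
import urllib.request
from urllib.parse import urlparse

def download_all(url_file="urls.txt", dest_root="downloads", log_file="failed.txt"):
    """Download each URL into a folder named after its parent directory in the URL path."""
    with open(url_file) as f:
        urls = [line.strip() for line in f if line.strip()]

    failed = []
    for url in urls:
        path = urlparse(url).path
        # e.g. /photos/Sample1/image1.jpg -> folder "Sample1", file "image1.jpg"
        folder = posixpath.basename(posixpath.dirname(path)) or "misc"
        name = posixpath.basename(path) or "index"

        target_dir = os.path.join(dest_root, folder)
        os.makedirs(target_dir, exist_ok=True)
        try:
            urllib.request.urlretrieve(url, os.path.join(target_dir, name))
        except Exception as exc:
            # Record which downloads failed, which xcopy never reported
            failed.append(f"{url}\t{exc}")

    with open(log_file, "w") as f:
        f.write("\n".join(failed))

if __name__ == "__main__":
    download_all()

The failed.txt log addresses the xcopy complaint: every URL that did not download is listed with the error, so those can be retried separately.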