Hi all! I'm trying to write a script that loops through a text file of URLs, downloads each file, runs a set of commands on it, appends the output to a variable, and then deletes the file.
I'm having trouble downloading the files with wget. It works if I type the URL in directly, but not when I loop through the .txt file.
My code:
urltxt = importdata("6626048943-download.txt");
for r = 1:length(urltxt)
    URL = convertCharsToStrings(urltxt{r});
    !echo 'machine urs.earthdata.nasa.gov login <uid> password <pswd>' >> ~/.netrc
    !chmod 0600 ~/.netrc
    !PATH=/usr/local/bin:$PATH wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies "$URL"
end
In the terminal, it works if I type:
URL=https://n5eil01u.ecs.nsidc.org/DP5/ATLAS/ATL03.005/2019.12.31/ATL03_20191231234802_00800610_005_01.h5
wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies "$URL"
so I don't think the wget line itself is the problem. It seems to be the way I'm passing the URL to wget as I loop through the text file. How can I pass the URL so the loop works properly?
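For context on where the loop might be going wrong: MATLAB's `!` shell escape passes the rest of the line to the system shell verbatim, so inside the loop `"$URL"` is looked up as a (nonexistent) shell environment variable rather than the MATLAB variable `URL`. A minimal sketch of one way to substitute the loop variable before the shell sees it, using `sprintf` and `system` (the file name and wget flags are taken from the code above; the status check is illustrative):

```matlab
% Read one URL per line from the download list.
urltxt = importdata("6626048943-download.txt");

for r = 1:length(urltxt)
    URL = convertCharsToStrings(urltxt{r});
    % Build the full wget command in MATLAB so the value of URL is
    % substituted into the string before it reaches the shell.
    cmd = sprintf(['PATH=/usr/local/bin:$PATH wget ' ...
                   '--load-cookies ~/.urs_cookies ' ...
                   '--save-cookies ~/.urs_cookies ' ...
                   '--keep-session-cookies "%s"'], URL);
    status = system(cmd);
    if status ~= 0
        warning("wget failed for %s", URL);
    end
end
```

With this approach the .netrc setup lines only need to run once, before the loop, rather than appending the same credentials on every iteration.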