Question about downloading mission data with downloadIsisData script

Good afternoon.

I’m a sysadmin at a university working with a faculty member who uses the ISIS software. As part of that, I’m working on downloading the full set of ISIS data as outlined in the USGS-Astrogeology/ISIS3 repository on GitHub (Integrated Software for Imagers and Spectrometers, a digital image processing package for imagery from current and past NASA and international planetary missions).

I’m using the downloadIsisData script and the rclone.conf obtained by following the instructions from USGS-Astrogeology/ISIS3 issue #5033 on GitHub:

# install rclone 
conda install -c conda-forge rclone

# download the script and rclone config file
curl -LJO

curl -LJO

# run the script as normal, using --config to point to where you downloaded the config file 
python downloadIsisData --config rclone.conf <mission> $ISISDATA

The command I ended up using is: python downloadIsisData --config rclone.conf all $ISISDATA

While the instructions indicate that the full set of data is ~520 GB, the amount downloaded so far is over 1.5 TB. Additionally, this is my third run of the downloadIsisData script, and each run downloads additional files.
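To see which parts of the download account for the unexpected size, something like the following sketch can summarize the on-disk footprint per top-level (mission) directory under $ISISDATA. This is just an illustrative helper, not part of the downloadIsisData tooling:

```python
import os

def dir_size(path):
    """Total size in bytes of all regular files under path (skips symlinks)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if not os.path.islink(fp):
                total += os.path.getsize(fp)
    return total

def summarize(isisdata):
    """Print per-directory sizes under isisdata, largest first; return the mapping."""
    entries = [e for e in os.scandir(isisdata) if e.is_dir()]
    sizes = {e.name: dir_size(e.path) for e in entries}
    for name, size in sorted(sizes.items(), key=lambda kv: -kv[1]):
        print(f"{size / 1e9:8.1f} GB  {name}")
    return sizes

# Usage: summarize(os.environ["ISISDATA"])
```

Running this after each pass of the script would also show which mission areas keep growing between runs.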

Is this the proper way to obtain the ISIS data? Should I continue running the script until it stops downloading additional files, or is the data constantly being updated, which is why there are additional files to download each time the script is run?

Thank you in advance and please let me know if I can clarify anything or provide any additional information.

We have moved to updating that data programmatically from public sources, which caused the sizes to grow more than expected. We have an open issue to control what gets downloaded at a finer level of granularity, so that anything that isn’t a kernel, as well as redundant kernels, can be skipped.

related issue: Kernel audit categorization · Issue #5077 · USGS-Astrogeology/ISIS3 · GitHub

If you can send me logs of what is being downloaded on each run (I assume this is on some cron?), that would help, although I doubt it differs from our nightly logs.
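If you capture each run's output to a file (e.g. `python downloadIsisData ... 2>&1 | tee run1.log`), a small sketch like this can extract which files were actually transferred and diff two runs. It assumes the log contains rclone's verbose transfer lines of the form `... INFO  : path/to/file: Copied (new)`; the exact format may vary with rclone version, so treat the pattern as an assumption:

```python
import re

# Matches rclone verbose transfer lines, e.g.
# "2024/05/01 10:00:00 INFO  : lro/kernels/ck/a.bc: Copied (new)"
COPIED = re.compile(r"INFO\s*:\s*(?P<path>.+?): Copied \(")

def transferred(log_text):
    """Return the set of file paths reported as copied in one run's log."""
    return {m.group("path") for m in COPIED.finditer(log_text)}

def new_in_second_run(log_a, log_b):
    """Files copied in the second run but not the first: likely files
    updated upstream (or retried) between the two runs."""
    return transferred(log_b) - transferred(log_a)
```

Comparing the sets across your three runs would show directly whether the script is re-fetching the same files or picking up newly published ones.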