
I am copying a huge number of small files from the local file system to Azure Blob Storage. A small fraction of the files fail to upload. AzCopy seems to just log these failures, and there is no easy way to retry them. It is not feasible (time-wise) to track down each failed file one by one and upload it manually. Are there any suggestions for handling these failures and retrying the uploads to Blob storage automatically?

jayt.dev

3 Answers


If the failed transfers are not caused by the SAS token or an authentication problem, you can try the commands below, which come from the AzCopy documentation.

Show the error message of the failed job:

azcopy jobs show <job-id> --with-status=Failed


Fix the underlying errors, then run the resume command:

azcopy jobs resume <job-id> --source-sas="<sas-token>"
azcopy jobs resume <job-id> --destination-sas="<sas-token>"

As the documentation for the resume command explains:

When you resume a job, AzCopy looks at the job plan file. The plan file lists all the files that were identified for processing when the job was first created. When you resume a job, AzCopy will attempt to transfer all of the files that are listed in the plan file which weren't already transferred.
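
Putting those pieces together, a minimal end-to-end sketch looks like this (the job ID and SAS token are placeholders to fill in from your own run):

azcopy jobs list
azcopy jobs show <job-id> --with-status=Failed
azcopy jobs resume <job-id> --destination-sas="<sas-token>"

The jobs list command prints recent jobs so you can find the ID of the failed one; the resume command then retries only the files from the plan file that were not already transferred.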

Jay Gong

Adding to this thread as I also occasionally ran into a large number of failures. For the narrow scenario of populating an initially empty directory, a simple solution is to run the azcopy copy command again with the --overwrite=false flag, as sketched below.
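
A minimal sketch of that rerun, with placeholder path, account, container, and SAS values (none of them from the original answer): because --overwrite=false makes AzCopy skip blobs that already exist at the destination, rerunning the original command re-uploads only the files that failed the first time.

azcopy copy "/data/local-files" "https://<account>.blob.core.windows.net/<container>?<sas-token>" --recursive --overwrite=false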

Jim Riekse

I had situations where the process was killed (which can happen in a containerised environment). The easiest way I found was to implement a retry mechanism (inspired by a similar answer). I ran into this issue on downloads, but the fix is much the same for uploads:

function azcopyWithRetry() {
    # $1: blob path(s) to pass to --include-path
    # $2: storage account name, $3: container name, $4: SAS token
    # $5: maximum number of attempts
    local list_of_blobs="${1}"
    local connection_string="https://${2}.blob.core.windows.net/${3}/*?${4}"
    local download_path="/tmp/download"
    local max_attempts="${5}"

    local n=1
    while true; do
        # Quote the URL so the shell neither glob-expands the * nor
        # splits on special characters inside the SAS token
        azcopy copy --include-path "${list_of_blobs}" "${connection_string}" "${download_path}" && break || {
            if [[ $n -lt $max_attempts ]]; then
                echo "WARN: Command failed, retrying (attempt $n/$max_attempts)"
                ((n++))
                sleep 1
            else
                echo "FAIL: Command failed after $n attempts. Giving up."
                return 1
            fi
        }
    done
}

And you would call this function with:

export LIST_BLOBS="blobs.txt"
export AZ_ACCOUNT="..."
export AZ_CONTAINER="..."
export AZ_SAS_TOKEN="..."
azcopyWithRetry "${LIST_BLOBS}" "${AZ_ACCOUNT}" "${AZ_CONTAINER}" "${AZ_SAS_TOKEN}" 5

which allows up to 5 attempts.
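
If the failures come from transient throttling, one optional tweak (not part of the original function) is to back off exponentially rather than sleeping a fixed second: replace sleep 1 inside the retry branch with

sleep $((2 ** (n - 1)))

so the waits grow as 1s, 2s, 4s, ... across attempts.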