When copying large files (e.g. over 10 MB) stored on a Windows file share across an unreliable link, a standard copy and paste has no mechanism for dealing with failures: if a transfer fails, you have to start all over again.
Robocopy, a tool built into Windows 7 and downloadable for earlier versions of Windows as part of the Windows Server 2003 Resource Kit Tools, has a flag (/z) that copies files in restartable mode, so it will automatically retry downloading a file whose transfer failed.
This helps when downloading a lot of files from a single location, but not particularly with large files.
However, you can use 7-Zip to create a compressed archive that is split into multiple volumes of a specified size. This lets you break a single large file into chunks that robocopy can download, retrying any failures automatically.
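As a minimal sketch, assuming 7-Zip is installed and 7z.exe is on your PATH, and using a hypothetical file name LargeFile.bin, the following creates an archive split into 10 MB volumes:

7z a -v10m LargeFile.7z LargeFile.bin

This produces LargeFile.7z.001, LargeFile.7z.002, and so on; extracting the first volume later (7z x LargeFile.7z.001) recombines and unzips the original file.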
For example, to download all the files in a share \\server\SplitLargeFile to the current working directory, use the following command:

robocopy \\server\SplitLargeFile . /z
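On a very flaky link you may also want to tune robocopy's retry behaviour with the /r (retry count) and /w (seconds to wait between retries) switches, since the defaults are a million retries with a 30-second wait. For example:

robocopy \\server\SplitLargeFile . /z /r:20 /w:10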
This is the closest thing I have found to the resume function in FTP, which picks up close to where you left off after a failed download.
A further useful step is to create an MD5 sum of the source file, which can then be used to verify the downloaded file after it has been recombined and unzipped by 7-Zip. My preferred tool for this is winmd5sum, which lets you create or compare MD5 sums from a right click. You'll need it installed on the source computer (so you can create the MD5 sum) and on the destination (so you can compare it). I always create a text file containing the MD5 sum in the location where I store the source.
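If you would rather do this from the command line, certutil, which ships with Windows, can compute the hash at both ends (LargeFile.bin here is just the example name from above):

certutil -hashfile LargeFile.bin MD5

Comparing the two outputs by eye, or redirecting each to a text file and diffing them, gives you the same end-to-end check.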