This document describes transferring data between a local computer (client) and HiPerGator (HPG). For the duration of the HiPerGator 1.0 to HiPerGator 2.0 (HPG1->HPG2) transition, information for both HPG1 and HPG2 will be presented.
There are a variety of command-line, GUI, and web-based tools available for transferring data to or from HiPerGator and between HPG1 and HPG2. Some examples include 'cp', 'mv', 'scp', 'rsync', and 'sftp' on the command line; GUI software such as FileZilla, Cyberduck, WinSCP, and MobaXterm; and the Globus data transfer tool, which is available via a web interface in addition to its command-line and GUI versions. The instructions below apply to virtually all of the tools mentioned above.
- Using the SSH login servers (gator, hpg2) at gator.rc.ufl.edu or hpg2.rc.ufl.edu for sftp transfers is not allowed. Use the dedicated data transfer servers for this task.
To transfer data to or from HiPerGator 1.0, with its main high-performance filesystem available at /scratch/lfs, connect to the 'rsync.rc.ufl.edu' data transfer server. The name of this server will be changed to point to HiPerGator 2.0 after the HPG1->HPG2 transition is complete.
To transfer data to or from HiPerGator 2.0, with its main high-performance filesystem available at /ufrc, connect to the 'sftp.rc.ufl.edu' data transfer server.
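For example, a file could be uploaded to /ufrc through this server with scp. The username ("gatorlink"), group ("mygroup"), and file names below are placeholders:

```shell
# Upload a local file to /ufrc via the HPG2 transfer server.
# "gatorlink", "mygroup", and "dataset.csv" are placeholder names.
scp dataset.csv gatorlink@sftp.rc.ufl.edu:/ufrc/mygroup/gatorlink/

# Pull a results directory from /ufrc back to the current local directory.
scp -r gatorlink@sftp.rc.ufl.edu:/ufrc/mygroup/gatorlink/results .
```

GUI clients such as FileZilla or WinSCP accept the same hostname and credentials.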
There are two main mechanisms for transferring data between HiPerGator 1.0 (HPG1) and HiPerGator 2.0 (HPG2). For large files the streaming approach used by Globus works very well. For many small files 'cp' or 'rsync' may work better. If in doubt, try Globus first.
To transfer data between /scratch/lfs and /ufrc with cp or rsync, log into 'dtn1.ufhpc' from any other node within HiPerGator. This server has a read-only mount of /scratch/lfs, so data must flow from /scratch/lfs to /ufrc; since both filesystems are mounted at the same time, both the Linux copy command 'cp' and 'rsync' will work.