This document describes transferring data between a local computer (client) and HiPerGator (HPG). For the duration of the HiPerGator1 to HiPerGator2 (HPG1->HPG2) transition, information for both HPG1 and HPG2 is presented.
A variety of command-line, GUI, and even web-based tools are available for transferring data to or from HiPerGator and between HPG1 and HPG2. Examples include 'cp', 'mv', 'scp', 'rsync', and 'sftp' on the command line; FileZilla, Cyberduck, WinSCP, and MobaXterm GUI software; and the Globus data transfer tool, which is available via a web interface in addition to command-line and GUI versions. The instructions below apply to virtually all of the tools mentioned above.
- Using login (gator) nodes via gator.rc.ufl.edu or hpg2.rc.ufl.edu for sftp transfers is no longer allowed, as the connections overwhelm the load-balancing solution that keeps the login servers highly available. Use the dedicated transfer servers for this task.
To transfer data to or from HiPerGator1, with its main high-performance filesystem available at /scratch/lfs, connect to the 'rsync.rc.ufl.edu' data transfer server. The name of this server will be changed to point to HiPerGator2 after the HPG1->HPG2 transition is complete.
To transfer data to or from HiPerGator 2.0, with its main high-performance filesystem available at /ufrc, connect to the 'sftp.rc.ufl.edu' data transfer server.
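A single-file copy to HPG2 through this server might look like the sketch below. The 'gatorlink' username, the 'mygroup' directory, and the file name are placeholders for your own values.

```shell
# Copy one file to HiPerGator 2.0 via the sftp.rc.ufl.edu transfer server.
# scp uses the same authentication as ssh, so it will prompt for your
# GatorLink password unless you have ssh keys configured.
scp results.tar.gz gatorlink@sftp.rc.ufl.edu:/ufrc/mygroup/gatorlink/
```

The same server also accepts interactive 'sftp' sessions and connections from GUI clients such as FileZilla, Cyberduck, WinSCP, and MobaXterm; point them at host sftp.rc.ufl.edu with your GatorLink credentials.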
There are two main mechanisms for data transfers between HiPerGator1 (HPG1) and HiPerGator2 (HPG2). For large files, the streaming approach used by Globus works very well. For many small files, 'cp' or 'rsync' will work better.
To transfer data between /scratch/lfs and /ufrc with cp or rsync, log into 'dtn1.ufhpc' from any other node within HiPerGator. This server has a read-only connection to /scratch/lfs, so the data must flow from /scratch/lfs to /ufrc; since both filesystems are mounted at the same time, both the Linux copy command 'cp' and 'rsync' will work.