The Hadoop get command is used to copy files from HDFS to the local file system. Use
hadoop fs -get or
hdfs dfs -get, specifying the HDFS file path you want to copy from, followed by the local file path you want to copy to.
The fs -get and fs -copyToLocal commands both copy files from HDFS to the local file system; the only difference is that with -copyToLocal the destination is restricted to a local file reference. For more information, see the Hadoop HDFS commands reference.
$ hadoop fs -get /hdfs-file-path /local-file-path
or
$ hdfs dfs -get /hdfs-file-path /local-file-path
Hadoop fs -get Command
The Hadoop fs shell command -get copies a file from the Hadoop HDFS file system to the local file system. Similarly, HDFS also has -copyToLocal. Below is the usage of the -get command.
Alternatively, you can also use
hdfs dfs -get or
hdfs dfs -copyToLocal.
$ hadoop fs -get [-p] [-f] [-ignorecrc] [-crc] /hdfs-file-path /local-file-path
or
$ hdfs dfs -get [-p] [-f] [-ignorecrc] [-crc] /hdfs-file-path /local-file-path
| Option | Description |
|--------|-------------|
| -p | Preserves access and modification times, ownership, and permissions (assuming the permissions can be propagated across file systems). |
| -f | Overwrites the destination if it already exists. |
| -ignorecrc | Skips CRC checks on the file(s) downloaded. |
| -crc | Writes CRC checksums for the files downloaded. |
Hadoop fs -get Command Examples
Below are examples of how to use the
hadoop fs -get command with several options.
Example 1: Preserves Access and Modification Times
$ hadoop fs -get -p /hdfs-file-path /local-file-path
or
$ hdfs dfs -get -p /hdfs-file-path /local-file-path
Example 2: Overwrites the Destination
$ hadoop fs -get -f /hdfs-file-path /local-file-path
or
$ hdfs dfs -get -f /hdfs-file-path /local-file-path
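To make the -f behavior concrete, here is a plain-shell sketch of the overwrite-vs-fail semantics. This uses local files and cp, not Hadoop; it is purely an illustration of what the flag changes.

```shell
# Local illustration only: -get without -f fails when the destination
# already exists, while -get -f replaces it, much like a forced local copy.
printf 'old contents' > dest.txt
printf 'new contents' > src.txt

# Without overwrite semantics, a careful copy refuses to clobber:
if [ -e dest.txt ]; then
  echo "get: dest.txt: File exists"
fi

# With -f semantics, the existing destination is simply replaced:
cp -f src.txt dest.txt
cat dest.txt    # new contents
```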
Example 3: Skip CRC Checks on the Files Downloaded
$ hadoop fs -get -ignorecrc /hdfs-file-path /local-file-path
or
$ hdfs dfs -get -ignorecrc /hdfs-file-path /local-file-path
Example 4: Write CRC Checksums for the Files Downloaded
$ hadoop fs -get -crc /hdfs-file-path /local-file-path
or
$ hdfs dfs -get -crc /hdfs-file-path /local-file-path
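If the role of the CRC here is unfamiliar: a CRC is a small checksum that changes whenever the file content changes, which is how HDFS detects a corrupted download. A quick local illustration with the standard cksum utility (not a Hadoop command):

```shell
# A one-byte change in the data produces a different CRC, so comparing
# checksums before and after a transfer catches corruption.
crc_a=$(printf 'hello' | cksum)
crc_b=$(printf 'hellp' | cksum)   # one byte differs
echo "$crc_a"
echo "$crc_b"
if [ "$crc_a" != "$crc_b" ]; then
  echo "CRCs differ: corruption would be detected"
fi
```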
Hadoop fs -getmerge Command
If you have multiple files in HDFS, the -getmerge option merges all of them into a single file and downloads that file to the local file system.
Optionally, -nl can be set to add a newline character (LF) at the end of each file.
$ hadoop fs -getmerge -nl /source /local-destination
or
$ hdfs dfs -getmerge -nl /source /local-destination
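The effect of -nl can be simulated with plain shell tools. This is only a sketch of the merge semantics, not a Hadoop invocation: -getmerge concatenates the source files in order, and -nl appends an LF after each one.

```shell
# Two small "part" files with no trailing newlines, mimicking HDFS output parts.
printf 'alpha' > part-00000
printf 'beta'  > part-00001

# Without -nl: plain concatenation, contents run together on one line.
cat part-00000 part-00001 > merged_plain
cat merged_plain   # alphabeta

# With -nl: a newline is appended after each file, so each file's
# contents end up on its own line in the merged result.
for f in part-00000 part-00001; do
  cat "$f"
  printf '\n'
done > merged_nl
head -1 merged_nl  # alpha
```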
In this article, you learned how to copy a file from the Hadoop HDFS file system to the local file system using the -get and
-copyToLocal commands, along with the different options available for these commands and examples of each.
- Hadoop FS | HDFS DFS Commands with Examples
- Hadoop Count Command – Returns HDFS File Size and File Counts
- Hadoop Copy Local File to HDFS – PUT Command
- Hadoop Yarn Configuration on Cluster
- Hadoop – How To Get HDFS File Size(DU)
- Spark Step-by-Step Setup on Hadoop Yarn Cluster
- Hadoop “WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform” warning