This chapter describes how to import all the tables from an RDBMS database server into HDFS. The data of each table is stored in a separate directory, and the directory name is the same as the table name.
Syntax
The following syntax is used to import all tables.
$ sqoop import-all-tables (generic-args) (import-args)
$ sqoop-import-all-tables (generic-args) (import-args)
Example
Let us take an example of importing all tables from the userdb database. The tables contained in the userdb database are listed below.
+--------------------+
|       Tables       |
+--------------------+
| emp                |
| emp_add             |
| emp_contact         |
+--------------------+
The following command is used to import all the tables from the userdb database.
$ sqoop import-all-tables --connect jdbc:mysql://localhost/userdb --username root
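If the MySQL server requires a password, the same command can be written so that Sqoop prompts for it at run time with the -P option, and the imported directories can be placed under a specific HDFS path with --warehouse-dir. The following is a sketch of such a variation; the path /user/hadoop/userdb is an assumed example, not part of the original command.

$ sqoop import-all-tables \
   --connect jdbc:mysql://localhost/userdb \
   --username root \
   -P \
   --warehouse-dir /user/hadoop/userdb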
Note − If you are using import-all-tables, every table in that database must have a primary key field.
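If one of the tables has no primary key, the import fails because Sqoop cannot decide how to split the work among mappers. A common workaround (a sketch, assuming Sqoop 1.4.5 or later, where the option is available) is to let Sqoop fall back to a single mapper for such tables with --autoreset-to-one-mapper.

$ sqoop import-all-tables \
   --connect jdbc:mysql://localhost/userdb \
   --username root \
   --autoreset-to-one-mapper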
The following command is used to verify that all the table data from the userdb database has been imported into HDFS.
$ $HADOOP_HOME/bin/hadoop fs -ls
It will show the tables of the userdb database as a list of directories.
Output
drwxr-xr-x - hadoop supergroup 0 2014-12-22 22:50 _sqoop
drwxr-xr-x - hadoop supergroup 0 2014-12-23 01:46 emp
drwxr-xr-x - hadoop supergroup 0 2014-12-23 01:50 emp_add
drwxr-xr-x - hadoop supergroup 0 2014-12-23 01:52 emp_contact
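To look at the records inside one of these directories, you can print its part files. The command below is a sketch that assumes the default layout of part-m-* files written under each table directory.

$ $HADOOP_HOME/bin/hadoop fs -cat emp/part-m-*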