NAME

mysqldbimport - Import Object Definitions or Data into a Database

SYNOPSIS

mysqldbimport [options] import_file ...

DESCRIPTION

This utility imports metadata (object definitions) or data or both for one or more databases from one or more files.

If an object exists on the destination server with the same name as an imported object, it is dropped before the new object is imported.

To skip objects by type, use the --skip option with a list of the objects to skip. This enables you to extract a particular set of objects, say, for importing only events (by excluding all other types). Similarly, to skip creation of UPDATE statements for BLOB data, specify the --skip-blobs option.
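
For example, a command along these lines might be used to import everything except grants and events while also omitting BLOB data (the file name is illustrative):

$ mysqldbimport --server=root@localhost --import=both \
  --skip=GRANTS,EVENTS --skip-blobs data.sql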

To specify the input format, use one of the following values with the --format option. These correspond to the output formats of the mysqldbexport utility:

sql (default)

Input consists of SQL statements. For definitions, this consists of the appropriate CREATE and GRANT statements. For data, this consists of INSERT statements (or bulk INSERT statements if the --bulk-insert option is specified).

grid

Input is formatted in grid or table format like that produced by the mysql monitor.

csv

Input is formatted in comma-separated values format.

raw_csv

Input is a simple CSV file containing uniform rows with values separated with commas. The file can contain a header (the first row) that lists the table columns. The option --table is required to use this format.

tab

Input is formatted in tab-separated format.

vertical

Input is formatted in single-column format like that produced by the \G command of the mysql monitor.

To indicate that input in csv or tab format does not contain column headers, specify the --no-headers option.
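
For instance, a CSV file without a header row might be imported with a command like the following (the file name is illustrative):

$ mysqldbimport --server=root@localhost --import=data \
  --format=csv --no-headers data.csv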

To turn off all feedback information, specify the --quiet option.

By default, the utility creates each table on the destination server using the same storage engine as the original table. To override this and specify the storage engine to use for all tables created on the destination server, use the --new-storage-engine option. If the destination server supports the new engine, all tables use that engine.

To specify the storage engine to use for tables whose original storage engine on the source server is not supported by the destination server, use the --default-storage-engine option.

The --new-storage-engine option takes precedence over --default-storage-engine if both are given.

If the --new-storage-engine or --default-storage-engine option is given and the destination server does not support the specified storage engine, a warning is issued and the server's default storage engine setting is used instead.
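
For example, to force all imported tables to use one engine such as InnoDB (shown here purely as an illustration), a command like the following might be used:

$ mysqldbimport --server=root@localhost --import=both \
  --new-storage-engine=InnoDB data.sql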

You must provide connection parameters (user, host, password, and so forth) for an account that has the appropriate privileges to access all objects in the operation. For details, see NOTES.

If you attempt to import databases on a server with GTIDs enabled (GTID_MODE = ON), a warning is generated if the import file does not include the GTID statements generated by mysqldbexport.

The utility will also generate a warning if you import databases on a server without GTIDs enabled and there are GTID statements present in the file. Use the --skip-gtid option to ignore the GTID statements.

To make the most use of GTIDs and export/import, you should export all of the databases on the server with the --all option. This will generate an export file with all of the databases and the GTIDs executed to that point. Importing this file on another server will ensure that server has all of the data as well as all of the GTIDs recorded correctly in its logs.
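
For example, assuming mysqldbexport is run with the --all option on the source server, the resulting file might be imported on the destination as follows (the host names and file name are illustrative):

$ mysqldbexport --server=root@source_host --export=both \
  --all > full_backup.sql
$ mysqldbimport --server=root@dest_host --import=both full_backup.sql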

OPTIONS

mysqldbimport accepts the following command-line options:

• --help

Display a help message and exit.

• --bulk-insert, -b

Use bulk insert statements for data.

• --default-storage-engine=<def_engine>

The engine to use for tables if the destination server does not support the original storage engine on the source server.

• --drop-first, -d

Drop each database to be imported, if it exists, before importing anything into it.

• --dryrun

Import the files and generate the statements but do not execute them. This is useful for testing input file validity.

• --format=<format>, -f<format>

Specify the input format. Permitted format values are sql, grid, tab, csv, raw_csv, and vertical. The default is sql.

• --import=<import_type>, -i<import_type>

Specify the import type. Permitted values are definitions = import only the definitions (metadata) for the objects in the database list, data = import only the table data for the tables in the database list, and both = import both the definitions and the data. The default is definitions.

If you attempt to import objects into an existing database, the result depends on the import format. If the format is definitions or both, an error occurs unless --drop-first is given. If the format is data, imported table data is added to existing table data.

• --new-storage-engine=<new_engine>

The engine to use for all tables created on the destination server.

• --no-headers, -h

Input does not contain column headers. This option applies only to input in csv and tab format.

• --quiet, -q

Turn off all messages for quiet execution.

• --server=<server>

Connection information for the server in the format: <user>[:<passwd>]@<host>[:<port>][:<socket>] or <login-path>[:<port>][:<socket>].

• --skip=<skip_objects>

Specify objects to skip in the operation as a comma-separated list (no spaces). Permitted values are CREATE_DB, DATA, EVENTS, FUNCTIONS, GRANTS, PROCEDURES, TABLES, TRIGGERS, and VIEWS.

• --skip-blobs

Do not import BLOB data.

• --skip-gtid

Skip execution of GTID_PURGED statements.

• --skip-rpl

Do not execute replication commands.

• --table=<db>,<table>

Specify the table for importing. This option is required when using --format=raw_csv.

• --verbose, -v

Specify how much information to display. Use this option multiple times to increase the amount of information. For example, -v = verbose, -vv = more verbose, -vvv = debug.

• --version

Display version information and exit.


NOTES

The login user must have the appropriate permissions to create new objects, access (read) the mysql database, and grant privileges. If a database to be imported already exists, the user must have read permission for it, which is needed to check the existence of objects in the database.

The actual privileges needed may differ from installation to installation, depending on the security privileges in place, on whether the database contains certain objects such as views or events, and on whether binary logging is enabled.

Some combinations of the options may result in errors during the operation. For example, excluding tables but not views may result in an error when a view is imported.

The --new-storage-engine and --default-storage-engine options apply to all destination tables in the operation.

For the --format and --import options, the permitted values are not case sensitive. In addition, values may be specified as any unambiguous prefix of a valid value. For example, --format=g specifies the grid format. An error occurs if a prefix matches more than one valid value.

When importing data and including the GTID commands, you may encounter an error similar to "GTID_PURGED can only be set when GTID_EXECUTED is empty". This occurs because the destination server is not in a clean replication state. To alleviate this problem, you can issue a "RESET MASTER" command on the destination prior to executing the import.
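
For example, assuming sufficient privileges on the destination, the replication state might be cleared before the import with commands like these (connection details and the file name are illustrative):

$ mysql --user=root --password -e "RESET MASTER"
$ mysqldbimport --server=root@localhost --import=both full_backup.sql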

The path to the MySQL client tools should be included in the PATH environment variable in order to use the authentication mechanism with login-paths. This allows the utility to use the my_print_defaults tool, which is required to read the login-path values from the login configuration file (.mylogin.cnf).
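
For example, a login path might be created with mysql_config_editor and then passed to the --server option (the login-path name is illustrative):

$ mysql_config_editor set --login-path=import_server \
  --host=localhost --user=root --password
$ mysqldbimport --server=import_server --import=both data.sql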

EXAMPLES

To import the metadata from the util_test database to the server on the local host using a file in CSV format, use this command:

$ mysqldbimport --server=root@localhost --import=definitions \
  --format=csv data.csv
# Source on localhost: ... connected.
# Importing definitions from data.csv.
#...done.

Similarly, to import the data from the util_test database to the server on the local host, importing the data using bulk insert statements, use this command:

$ mysqldbimport --server=root@localhost --import=data \
  --bulk-insert --format=csv data.csv
# Source on localhost: ... connected.
# Importing data from data.csv.
#...done.

To import both data and definitions from the util_test database, importing the data using bulk insert statements from a file that contains SQL statements, use this command:

$ mysqldbimport --server=root@localhost --import=both \
  --bulk-insert --format=sql data.sql
# Source on localhost: ... connected.
# Importing definitions and data from data.sql.
#...done.

COPYRIGHT


AUTHOR

Oracle Corporation (http://dev.mysql.com/).

SEE ALSO

For more information, please refer to the MySQL Utilities section of the MySQL Workbench Reference Manual, which is available online at http://dev.mysql.com/doc/workbench/en/.