How do I download a large database from phpMyAdmin?

When working with MySQL I often use phpMyAdmin, which is a nice GUI for manipulating my database. But some operations won't work in phpMyAdmin when the database is too large; in particular, you can't import or export really large databases, so sometimes you need to work on the command line. I thought I'd document some of the command-line snippets we use frequently. In the following, replace [USERNAME] with your MySQL username, [DBNAME] with your database name, [/path_to_file/DBNAME] with the path and name of the file used for the database dump, and [/path_to_mysql/] with the path to the mysql bin directory (like /Applications/MAMP/Library/bin/).
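For concreteness, here is a sketch of what those substitutions might look like on a MAMP setup (all of the values below are invented examples, not requirements):

```shell
# Hypothetical values for illustration -- substitute your own.
USERNAME=drupaluser
DBNAME=drupal_site
MYSQL_BIN=/Applications/MAMP/Library/bin/

# With those set, a dump command like the ones below would expand to:
echo "${MYSQL_BIN}mysqldump -u $USERNAME -p $DBNAME"
```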

Copy/Export a Large Database

MySQL has no 'copy database' command. You create a copy by dumping the database with mysqldump. To dump the database and gzip it at the same time, use the following. It will prompt you for your password.

  
mysqldump -u [USERNAME] -p [DBNAME] | gzip > [/path_to_file/DBNAME].sql.gz 
  

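Before trusting a dump, it's worth a quick sanity check: gzip can verify the archive, and a successful mysqldump ends with a "-- Dump completed" comment line. A sketch, demonstrated on a stand-in file so the commands are runnable (point the path at your real .sql.gz instead):

```shell
# Stand-in dump so the checks below are runnable; use your real
# [/path_to_file/DBNAME].sql.gz instead.
printf -- '-- MySQL dump\nCREATE TABLE t (id INT);\n-- Dump completed on 2010-01-01\n' \
  | gzip > /tmp/demo.sql.gz

gzip -t /tmp/demo.sql.gz && echo "archive OK"   # verifies gzip integrity
zcat /tmp/demo.sql.gz | tail -1                 # should show the "-- Dump completed" line
```

If the last line is a half-finished INSERT instead of the completion comment, the dump was cut off and shouldn't be imported.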
Import a Large Database

If you want to replace the database with a fresh dump created by the above process, do the following. First, unzip the file.

  
gzip -d [/path_to_file/DBNAME].sql.gz
  

Get to a mysql prompt (you will be asked for your password):

  
[/path_to_mysql/]mysql -u [USERNAME] -p
  

Then do the following to wipe out the old database and replace it with the new dump:

  
SHOW DATABASES;
DROP DATABASE [DBNAME];
CREATE DATABASE [DBNAME];
USE [DBNAME];
SOURCE [/path_to_file/DBNAME].sql;
  

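Instead of unzipping first, the unzip and import steps can be combined into one pipeline, so the uncompressed .sql file never lands on disk (you would still drop and recreate the database first, as above). With a live server the consumer of the pipe would be mysql; in this runnable sketch `wc -l` stands in just to show the plumbing:

```shell
# Plumbing demo with a stand-in file and a stand-in consumer:
printf 'line one\nline two\n' | gzip > /tmp/pipe_demo.gz
gunzip < /tmp/pipe_demo.gz | wc -l    # counts the 2 decompressed lines

# With a real server, the equivalent import would be:
#   gunzip < [/path_to_file/DBNAME].sql.gz | [/path_to_mysql/]mysql -u [USERNAME] -p [DBNAME]
```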
Conditional Dumps

Sometimes the search index is huge and you want to omit it from the dump. Do so with:

  
mysqldump -u [USERNAME] -p [DBNAME] --ignore-table=[DBNAME].search_index | gzip > [/path_to_file/DBNAME].sql.gz 
  

There are actually a number of tables you could exclude, like the sessions table, the watchdog table, and all the cache* tables. But if you use the above technique to destroy and recreate the database after excluding them, you will be missing all those tables entirely. So you will want to use a two-step process instead. First, create a backup with ONLY the table structure, no data.

  
mysqldump -u [USERNAME] -p [DBNAME] --no-data | gzip > [/path_to_file/DBNAME].info.sql.gz
  

Then create a backup, including only data from the tables you need.

  
[/path_to_mysql/]mysqldump -u [USERNAME] -p [DBNAME] --no-create-info --ignore-table=[DBNAME].search_index --ignore-table=[DBNAME].cache --ignore-table=[DBNAME].cache_block --ignore-table=[DBNAME].cache_content --ignore-table=[DBNAME].cache_filter --ignore-table=[DBNAME].cache_form --ignore-table=[DBNAME].cache_menu --ignore-table=[DBNAME].cache_mollom --ignore-table=[DBNAME].cache_page --ignore-table=[DBNAME].cache_pathdst --ignore-table=[DBNAME].cache_pathsrc --ignore-table=[DBNAME].cache_views | gzip > [/path_to_file/DBNAME].data.sql.gz
  

That's a lot of typing. Wouldn't it be nice if there were a wildcard we could use instead of typing out all those cache_ tables? Well, there is! You can do:

  
[/path_to_mysql/]mysqldump -u [USERNAME] -p [DBNAME] --no-create-info --ignore-table=[DBNAME].search_index --ignore-table=[DBNAME].cache% | gzip > [/path_to_file/DBNAME].data.sql.gz
  

After doing this, just import the two files as above: first the one with only the table structure, and then the data. The result is a (relatively) small database with all the optional tables emptied out. Note that the wildcard trick above is not documented anywhere that I can see, so you'll want to test that it works in your setup.
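Since the wildcard behavior is undocumented, a safer alternative is to generate the --ignore-table flags from an explicit table list. `build_ignore_flags` below is a helper name invented for this sketch, not a mysqldump feature:

```shell
# Turn a database name plus table names into mysqldump --ignore-table flags.
# (build_ignore_flags is a made-up helper for illustration.)
build_ignore_flags() {
  db="$1"; shift
  flags=""
  for tbl in "$@"; do
    flags="$flags --ignore-table=$db.$tbl"
  done
  echo "${flags# }"   # trim the leading space
}

build_ignore_flags mydb cache cache_block cache_form
# prints: --ignore-table=mydb.cache --ignore-table=mydb.cache_block --ignore-table=mydb.cache_form
```

With a live server you could feed it the real list, e.g. `TABLES=$(mysql -u [USERNAME] -p -N -e "SHOW TABLES LIKE 'cache%'" [DBNAME])`, then pass `$(build_ignore_flags [DBNAME] $TABLES)` to mysqldump.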


Karen Stevenson

How do I download a large database from phpMyAdmin?

Karen is one of Drupal's great pioneers, co-creating the Content Construction Kit (CCK) which has become Field UI, part of Drupal core.

The phpMyAdmin tool can typically be used to import MySQL databases only up to a host-set size limit (around 128MB in this case). Files above that limit cause the service to time out, and the import fails.

To import a larger database to your account, first upload the database backup copy to your account. Make sure to place the backup file outside the public_html folder so that it is not accessible from a browser.

Then you can use the Site Tools -> Site -> MySQL tool. There, find the database you want to import your backup into, click the kebab menu -> Import Database Dump, navigate to the location of your backup in the popup that appears, select it, and click Import.

N.B. Importing into an existing database will overwrite any matching existing data. We strongly recommend that you create a backup of your database first; you can learn how to do this in our MySQL Database Backup tutorial.

Alternatively, you can import larger databases to your account over an SSH connection; check our detailed article on how to do that. There are no timeout limits when importing via SSH, and the operation should complete successfully.

How do I export a large MySQL database in phpMyAdmin?

1. Connect to your database using phpMyAdmin.
2. From the left side, select your database.
3. Click the Export tab at the top of the panel.
4. Select the Custom option.
5. Select the file format for your database.
6. Click Select All in the Export box to export all tables.

How do I export a large database?

Step-by-step MySQL backup procedure:

1. Set compression for the connection.
2. Choose a database for export and open the Database Backup Wizard.
3. Set options to create an all-of-a-piece database snapshot.
4. Make a backup of your MySQL database.
5. Choose a database to import and open the Database Restore Wizard.
6. Import the database.

Can phpMyAdmin handle big data?

phpMyAdmin can export tables of considerable size to SQL, CSV, or spreadsheet formats, but in practice PHP's execution-time and memory limits can still interrupt very large exports, so the command line remains more reliable for big databases.

How do I import a large database into phpMyAdmin?

Edit the php.ini used by phpMyAdmin:

1. Log in to your cPanel.
2. Copy your php.ini from your main domain to your subdomain's document root folder.
3. Edit the php.ini you copied to your subdomain for your phpMyAdmin.
4. Change the relevant size limits to be larger than 100MB.
5. Make the php.ini changes take effect.
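The php.ini directives that usually gate phpMyAdmin upload size are not spelled out above; on a typical PHP setup they are the following (the values are examples, and your host may require different ones or disallow overrides):

```ini
; Typical php.ini limits that gate phpMyAdmin imports (example values):
upload_max_filesize = 256M
post_max_size = 256M
memory_limit = 512M
max_execution_time = 600
max_input_time = 600
```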