---
title: Import and export data
description: Import existing SQLite tables into D1 or export a D1 database for local use.
image: https://developers.cloudflare.com/dev-products-preview.png
---

# Import and export data

D1 allows you to import existing SQLite tables and their data directly, enabling you to migrate existing data into D1 quickly and easily. This can be useful when migrating applications to use Workers and D1, or when you want to prototype a schema locally before importing it to your D1 database(s).

D1 also allows you to export a database. This can be useful for [local development](https://developers.cloudflare.com/d1/best-practices/local-development/) or testing.

## Import an existing database

To import an existing SQLite database into D1, you must have:

1. The Cloudflare [Wrangler CLI installed](https://developers.cloudflare.com/workers/wrangler/install-and-update/).
2. A database to use as the target.
3. An existing SQLite (version 3.0+) database file to import.

Note

You cannot import a raw SQLite database (`.sqlite3` files) directly. Refer to [how to convert an existing SQLite file](#convert-sqlite-database-files) first.

For example, consider the following `users_export.sql` schema and values, which includes a `CREATE TABLE IF NOT EXISTS` statement:

```sql
CREATE TABLE IF NOT EXISTS users (
  id VARCHAR(50),
  full_name VARCHAR(50),
  created_on DATE
);
INSERT INTO users (id, full_name, created_on) VALUES ('01GREFXCN9519NRVXWTPG0V0BF', 'Catlaina Harbar', '2022-08-20 05:39:52');
INSERT INTO users (id, full_name, created_on) VALUES ('01GREFXCNBYBGX2GC6ZGY9FMP4', 'Hube Bilverstone', '2022-12-15 21:56:13');
INSERT INTO users (id, full_name, created_on) VALUES ('01GREFXCNCWAJWRQWC2863MYW4', 'Christin Moss', '2022-07-28 04:13:37');
INSERT INTO users (id, full_name, created_on) VALUES ('01GREFXCNDGQNBQAJG1AP0TYXZ', 'Vlad Koche', '2022-11-29 17:40:57');
INSERT INTO users (id, full_name, created_on) VALUES ('01GREFXCNF67KV7FPPSEJVJMEW', 'Riane Zamora', '2022-12-24 06:49:04');
```
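
Before importing, you can sanity-check that a dump parses as valid SQLite by executing it against a scratch in-memory database. A Python sketch using the built-in `sqlite3` module (the SQL is abbreviated from the example above):

```python
import sqlite3

# Contents of the export file (abbreviated from the example above);
# in practice you would read it from disk: sql = open("users_export.sql").read()
sql = """
CREATE TABLE IF NOT EXISTS users (
  id VARCHAR(50),
  full_name VARCHAR(50),
  created_on DATE
);
INSERT INTO users (id, full_name, created_on) VALUES ('01GREFXCN9519NRVXWTPG0V0BF', 'Catlaina Harbar', '2022-08-20 05:39:52');
"""

con = sqlite3.connect(":memory:")  # scratch database; nothing is persisted
con.executescript(sql)             # raises sqlite3.OperationalError on invalid SQL
print(con.execute("SELECT COUNT(*) FROM users").fetchone()[0])
```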

With your `users_export.sql` file in the current working directory, you can pass the `--file=users_export.sql` flag to `d1 execute` to execute (import) your table schema and values:

```sh
npx wrangler d1 execute example-db --remote --file=users_export.sql
```

To confirm your table was imported correctly and is queryable, execute a `SELECT` statement to fetch all the tables from your D1 database:

```sh
npx wrangler d1 execute example-db --remote --command "SELECT name FROM sqlite_schema WHERE type='table' ORDER BY name;"
```

```
...
🌀 To execute on your local development database, remove the --remote flag from your wrangler command.
🚣 Executed 1 commands in 0.3165ms
┌────────┐
│ name   │
├────────┤
│ _cf_KV │
├────────┤
│ users  │
└────────┘
```

Note

The `_cf_KV` table is a reserved table used by D1's underlying storage system. It cannot be queried and does not incur read/write operations charges against your account.

From here, you can query your new table from your Worker [using the D1 Workers Binding API](https://developers.cloudflare.com/d1/worker-api/).

Known limitations

For imports, `wrangler d1 execute --file` is limited to 5GiB files, the same as the [R2 upload limit](https://developers.cloudflare.com/r2/platform/limits/). For imports larger than 5GiB, we recommend splitting the data into multiple files.
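
The split itself can be scripted. A minimal Python sketch (the helper name and chunk size are illustrative; it assumes every statement ends with `;` at the end of a line, which is true for `sqlite3 .dump` output):

```python
# Split a SQL dump into multiple files without breaking statements apart.
# String literals that contain ";\n" would need a real SQL parser; this
# heuristic is sufficient for typical sqlite3 .dump output.

def split_sql_dump(lines, max_bytes):
    """Group complete SQL statements into chunks of at most max_bytes each."""
    chunks, current, size, statement = [], [], 0, []
    for line in lines:
        statement.append(line)
        if line.rstrip().endswith(";"):  # end of a complete statement
            stmt_bytes = sum(len(s.encode()) for s in statement)
            if current and size + stmt_bytes > max_bytes:
                chunks.append(current)
                current, size = [], 0
            current.extend(statement)
            size += stmt_bytes
            statement = []
    if statement:                        # keep any trailing partial statement
        current.extend(statement)
    if current:
        chunks.append(current)
    return chunks

# Illustrative usage: write each chunk to users_export_<n>.sql.
# for n, chunk in enumerate(split_sql_dump(open("users_export.sql"), 5 * 2**30)):
#     with open(f"users_export_{n}.sql", "w") as f:
#         f.writelines(chunk)
```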

### Convert SQLite database files

Note

In order to convert a raw SQLite3 database dump (a `.sqlite3` file) you will need the [sqlite command-line tool ↗](https://sqlite.org/cli.html) installed on your system.

If you have an existing SQLite database from another system, you can import its tables into a D1 database. Using the `sqlite3` command-line tool, you can convert a `.sqlite3` file into a series of SQL statements that can be imported (executed) against a D1 database.

For example, if you have a raw SQLite dump called `db_dump.sqlite3`, run the following command to convert it:

```sh
sqlite3 db_dump.sqlite3 .dump > db.sql
```

Once you have run the above command, you will need to edit the output SQL file to be compatible with D1:

1. Remove `BEGIN TRANSACTION` and `COMMIT;` from the file.
2. Remove the following table creation statement (if present):

   ```sql
   CREATE TABLE _cf_KV (
     key TEXT PRIMARY KEY,
     value BLOB
   ) WITHOUT ROWID;
   ```
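
Both the dump and the cleanup can also be scripted in one pass with Python's built-in `sqlite3` module, whose `iterdump()` produces the same statements as the `.dump` command. A sketch (the function and file names are illustrative):

```python
import sqlite3

def dump_for_d1(db_path):
    """Yield D1-compatible SQL statements from a raw SQLite database file."""
    con = sqlite3.connect(db_path)
    try:
        for stmt in con.iterdump():
            # D1 manages transactions itself; drop the wrapper statements.
            if stmt in ("BEGIN TRANSACTION;", "COMMIT;"):
                continue
            # Skip D1's reserved internal table if it appears in the dump.
            if "_cf_KV" in stmt:
                continue
            yield stmt
    finally:
        con.close()

# Illustrative usage:
# with open("db.sql", "w") as f:
#     f.write("\n".join(dump_for_d1("db_dump.sqlite3")))
```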

You can then follow the steps to [import an existing database](#import-an-existing-database) into D1 by using the `.sql` file you generated from the database dump as the input to `wrangler d1 execute`.

## Export an existing D1 database

In addition to importing existing SQLite databases, you might want to export a D1 database for local development or testing. You can export a D1 database to a `.sql` file using [wrangler d1 export](https://developers.cloudflare.com/workers/wrangler/commands/d1/#d1-export) and then execute (import) with `d1 execute --file`.

To export full D1 database schema and data:

```sh
npx wrangler d1 export <database_name> --remote --output=./database.sql
```

To export single table schema and data:

```sh
npx wrangler d1 export <database_name> --remote --table=<table_name> --output=./table.sql
```

To export only D1 database schema:

```sh
npx wrangler d1 export <database_name> --remote --output=./schema.sql --no-data
```

To export only D1 table schema:

```sh
npx wrangler d1 export <database_name> --remote --table=<table_name> --output=./schema.sql --no-data
```

To export only D1 database data:

```sh
npx wrangler d1 export <database_name> --remote --output=./data.sql --no-schema
```

To export only D1 table data:

```sh
npx wrangler d1 export <database_name> --remote --table=<table_name> --output=./data.sql --no-schema
```

### Known limitations

* Export is not supported for virtual tables, or for databases that contain them. D1 supports virtual tables for full-text search using SQLite's [FTS5 module ↗](https://www.sqlite.org/fts5.html). As a workaround, delete any virtual tables, export, and then recreate them.
* A running export will block other database requests.
* Any numeric value in a column is affected by JavaScript's 52-bit precision for numbers. If you store a very large number (in `int64`), then retrieve the same value, the returned value may be less precise than your original number.
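
The last point can be reproduced with Python floats, which use the same IEEE 754 double-precision format as JavaScript numbers:

```python
# JavaScript numbers (and Python floats) are IEEE 754 doubles, so not every
# integer above 2**53 can be represented exactly.
big = 2**53 + 1                  # 9007199254740993
round_tripped = int(float(big))  # 9007199254740992: the nearest representable double

print(big == round_tripped)      # False
```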

## Troubleshooting

If you receive an error when trying to import an existing schema and/or dataset into D1:

* Ensure you are importing data in SQL format (typically with a `.sql` file extension). Refer to [how to convert SQLite files](#convert-sqlite-database-files) if you have a `.sqlite3` database dump.
* Make sure the schema is [SQLite3 ↗](https://www.sqlite.org/docs.html) compatible. You cannot import data from a MySQL or PostgreSQL database into D1, as the types and SQL syntax are not directly compatible.
* If you have foreign key relationships between tables, ensure you are importing the tables in the right order. You cannot refer to a table that does not yet exist.
* If you receive a `"cannot start a transaction within a transaction"` error, make sure you have removed `BEGIN TRANSACTION` and `COMMIT` from your dumped SQL statements.

### Resolve `Statement too long` error

If you encounter a `Statement too long` error when trying to import a large SQL file into D1, it means that one of the SQL statements in your file exceeds the maximum allowed length.

To resolve this issue, convert the single large `INSERT` statement into multiple smaller `INSERT` statements. For example, instead of inserting 1,000 rows in one statement, split it into four groups of 250 rows, as illustrated in the code below.

Before:

```sql
INSERT INTO users (id, full_name, created_on)
VALUES
  ('1', 'Jacquelin Elara', '2022-08-20 05:39:52'),
  ('2', 'Hubert Simmons', '2022-12-15 21:56:13'),
  ...
  ('1000', 'Boris Pewter', '2022-12-24 07:59:54');
```

After:

```sql
INSERT INTO users (id, full_name, created_on)
VALUES
  ('1', 'Jacquelin Elara', '2022-08-20 05:39:52'),
  ...
  ('100', 'Eddy Orelo', '2022-12-15 22:16:15');
...
INSERT INTO users (id, full_name, created_on)
VALUES
  ('901', 'Roran Eroi', '2022-08-20 05:39:52'),
  ...
  ('1000', 'Boris Pewter', '2022-12-15 22:16:15');
```
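
Rewriting the batches by hand is tedious for a large dump. A minimal Python sketch that generates the smaller statements (the function name and batch size are illustrative; values are quoted naively, without the escaping real data would need):

```python
def batched_inserts(table, columns, rows, batch_size=250):
    """Yield one multi-row INSERT statement per batch of rows.

    Values are quoted naively for illustration; real data containing quotes
    or other special characters needs proper escaping.
    """
    header = f"INSERT INTO {table} ({', '.join(columns)})\nVALUES\n"
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        values = ",\n".join(
            "  (" + ", ".join(f"'{v}'" for v in row) + ")" for row in batch
        )
        yield header + values + ";"

rows = [(str(i), f"User {i}", "2022-08-20 05:39:52") for i in range(1, 1001)]
statements = list(batched_inserts("users", ["id", "full_name", "created_on"], rows))
print(len(statements))  # 1,000 rows in batches of 250 -> 4 statements
```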

## Foreign key constraints

When importing data, you may need to temporarily disable [foreign key constraints](https://developers.cloudflare.com/d1/sql-api/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys.

Refer to the [foreign key documentation](https://developers.cloudflare.com/d1/sql-api/foreign-keys/) to learn more about how to work with foreign keys and D1.
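
The deferred behavior can be sketched locally with stock SQLite via Python's built-in driver (the table names are illustrative; D1 accepts the same pragma): inside the transaction, a child row may be inserted before its parent, as long as the constraint holds by `COMMIT`.

```python
import sqlite3

# isolation_level=None puts the driver in autocommit mode, so the explicit
# BEGIN/COMMIT below control the transaction directly.
con = sqlite3.connect(":memory:", isolation_level=None)
con.execute("PRAGMA foreign_keys = ON")  # enforcement is opt-in in stock SQLite
con.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY)")
con.execute(
    "CREATE TABLE books (id INTEGER PRIMARY KEY,"
    " author_id INTEGER REFERENCES authors(id))"
)

con.execute("BEGIN")
# Defer constraint checks until COMMIT; the pragma switches itself off
# again when the transaction ends.
con.execute("PRAGMA defer_foreign_keys = true")
con.execute("INSERT INTO books VALUES (1, 10)")  # parent row does not exist yet
con.execute("INSERT INTO authors VALUES (10)")   # constraint satisfied before COMMIT
con.execute("COMMIT")

print(con.execute("SELECT COUNT(*) FROM books").fetchone()[0])  # 1
```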

## Next steps

* Read the SQLite [CREATE TABLE ↗](https://www.sqlite.org/lang_createtable.html) documentation.
* Learn how to [use the D1 Workers Binding API](https://developers.cloudflare.com/d1/worker-api/) from within a Worker.
* Understand how [database migrations work](https://developers.cloudflare.com/d1/reference/migrations/) with D1.

