Take a backup or restore a backup using commands
Users can use the BACKUP and RESTORE commands to export backups to their own storage buckets, in addition to backing up or restoring via the user interface. This guide provides the commands for all three CSPs.
Requirements
You will need the following details to export backups to, or restore them from, your own CSP storage bucket:
- AWS
  - AWS S3 endpoint, in the format:
    s3://<bucket_name>.s3.amazonaws.com/<optional_directory>
    For example: s3://testchbackups.s3.amazonaws.com/backups, where testchbackups is the name of the S3 bucket to export backups to and backups is an optional subdirectory.
  - AWS access key and secret. AWS role-based authentication is also supported and can be used in place of the AWS access key and secret, as described in the section above.
- GCP
  - GCS endpoint, in the format:
    https://storage.googleapis.com/<bucket_name>/
  - Access HMAC key and HMAC secret.
- Azure
  - Azure storage connection string.
  - Azure container name in the storage account.
  - Azure Blob within the container.
Backup / Restore specific DB
Here we show the backup and restore of a single database. See the backup command summary for full backup and restore commands.
AWS S3
- BACKUP
- RESTORE
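As a minimal sketch, the pair of statements might look like the following. The database name test_backups, the bucket path (built from the Requirements example above), the <uuid>, and the credential placeholders are all assumptions to replace with your own values; write the endpoint in whichever form matches the S3 endpoint you gathered in the Requirements section.

```sql
-- Sketch: replace the database name, bucket path, uuid, and credentials with your own values.
BACKUP DATABASE test_backups
    TO S3('https://testchbackups.s3.amazonaws.com/backups/<uuid>', '<key id>', '<key secret>');

RESTORE DATABASE test_backups
    FROM S3('https://testchbackups.s3.amazonaws.com/backups/<uuid>', '<key id>', '<key secret>');
```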
Where uuid is a unique identifier, used to differentiate a set of backups. You will need to use a different uuid for each new backup in this subdirectory, otherwise you will get a BACKUP_ALREADY_EXISTS error. For example, if you are taking daily backups, you will need to use a new uuid each day.
Google Cloud Storage (GCS)
- BACKUP
- RESTORE
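A sketch of the corresponding GCS statements, addressing the bucket through its S3-compatible endpoint with the HMAC credentials from the Requirements section; the database name, bucket name, and placeholder values are assumptions.

```sql
-- Sketch: GCS is reached via its S3-compatible endpoint using HMAC credentials.
BACKUP DATABASE test_backups
    TO S3('https://storage.googleapis.com/<bucket_name>/<uuid>', '<hmac-key>', '<hmac-secret>');

RESTORE DATABASE test_backups
    FROM S3('https://storage.googleapis.com/<bucket_name>/<uuid>', '<hmac-key>', '<hmac-secret>');
```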
Where uuid is a unique identifier, used to identify the backup. You will need to use a different uuid for each new backup in this subdirectory, otherwise you will get a BACKUP_ALREADY_EXISTS error. For example, if you are taking daily backups, you will need to use a new uuid each day.
Azure Blob Storage
- BACKUP
- RESTORE
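A sketch of the Azure Blob Storage statements, built from the connection string, container name, and blob listed in the Requirements section; the database name and all placeholder values are assumptions.

```sql
-- Sketch: replace the connection string, container, and blob path with your own values.
BACKUP DATABASE test_backups
    TO AzureBlobStorage('<AzureBlobStorage endpoint connection string>', '<container_name>', '<blob_path>/<uuid>');

RESTORE DATABASE test_backups
    FROM AzureBlobStorage('<AzureBlobStorage endpoint connection string>', '<container_name>', '<blob_path>/<uuid>');
```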
Where uuid is a unique identifier, used to identify the backup. You will need to use a different uuid for each new backup in this subdirectory, otherwise you will get a BACKUP_ALREADY_EXISTS error. For example, if you are taking daily backups, you will need to use a new uuid each day.
Backup / Restore entire service
To back up the entire service, use the commands below. This backup will contain all user data and system data for created entities, settings profiles, role policies, quotas, and functions. The commands are shown here for AWS S3; you can use them with the syntax described above to take backups to GCS and Azure Blob Storage.
- BACKUP
- RESTORE
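A sketch of a whole-service backup and restore for AWS S3; the BACKUP ALL / RESTORE ALL form used here is an assumption, as are the bucket path, uuid, and credential placeholders.

```sql
-- Sketch: BACKUP ALL / RESTORE ALL are assumed here to cover the whole service;
-- replace the bucket path, uuid, and credentials with your own values.
BACKUP ALL
    TO S3('https://testchbackups.s3.amazonaws.com/backups/<uuid>', '<key id>', '<key secret>');

RESTORE ALL
    FROM S3('https://testchbackups.s3.amazonaws.com/backups/<uuid>', '<key id>', '<key secret>');
```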
where uuid is a unique identifier, used to identify the backup.
FAQ
What happens to the backups in my cloud object storage? Are they cleaned up by ClickHouse at some point?
We provide you the ability to export backups to your bucket; however, we do not clean up or delete any of the backups once they are written. You are responsible for managing the lifecycle of the backups in your bucket, including deleting or archiving them as needed, or moving them to cheaper storage to optimize overall cost.
What happens to the restore process if I move some of the existing backups to another location?
If any backups are moved to another location, the restore command will need to be updated to reference the new location where the backups are stored.
What if I change my credentials required to access the object storage?
You will need to update the changed credentials in the UI for backups to start succeeding again.
What if I change the location to export my external backups to?
You will need to update the new location in the UI, and new backups will be written to the new location. The old backups will remain in the original location.
How can I disable external backups on a service that I enabled them for?
To disable external backups for a service, go to the service settings screen and click Change external backup. On the subsequent screen, click Remove setup to disable external backups for the service.