BDB dataset import sources object
Documents the bdb dataset_import_sources object used with Redis Enterprise Software REST API calls.
You can import data to a database from the following location types:
- HTTP/S
- FTP
- SFTP
- Amazon S3
- Google Cloud Storage
- Microsoft Azure Storage
- NAS/Local Storage
The source file to import must be in RDB format. It can also be a gzip-compressed (gz) RDB file.
Supply an array of dataset import source objects to import data from multiple files.
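For example, a request body that imports data from two RDB files over FTP might look like the following sketch. The host, path, and file names are placeholders:

```json
{
    "dataset_import_sources": [
        {
            "type": "url",
            "url": "ftp://ftp.example.com/backups/import_1.rdb.gz"
        },
        {
            "type": "url",
            "url": "ftp://ftp.example.com/backups/import_2.rdb.gz"
        }
    ]
}
```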
Basic parameters
For all import location objects, you must specify the location type via the type field.
Location type | "type" value |
---|---|
FTP/S | "url" |
SFTP | "sftp" |
Amazon S3 | "s3" |
Google Cloud Storage | "gs" |
Microsoft Azure Storage | "abs" |
NAS/Local Storage | "mount_point" |
Location-specific parameters
Additional required parameters differ based on the import location type.
FTP
Key name | Type | Description |
---|---|---|
url | string | A URI that represents the FTP/S location with the following format: ftp://user:password@host:port/path/. The user and password can be omitted if not needed. |
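A single FTP import source object might look like the following sketch; the credentials, host, and file path are placeholders:

```json
{
    "type": "url",
    "url": "ftp://user:password@ftp.example.com:21/backups/import.rdb"
}
```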
SFTP
Key name | Type | Description |
---|---|---|
key | string | SSH private key to secure the SFTP server connection. If you do not specify an SSH private key, the autogenerated private key of the cluster is used and you must add the SSH public key of the cluster to the SFTP server configuration. (optional) |
sftp_url | string | SFTP URL in the format: sftp://user:password@host[:port]/path/filename.rdb. The default port number is 22 and the default path is '/'. |
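An SFTP import source object might look like the following sketch (placeholder host and path). Because the optional key field is omitted here, the cluster's autogenerated private key would be used, so the cluster's SSH public key must be added to the SFTP server configuration:

```json
{
    "type": "sftp",
    "sftp_url": "sftp://user@sftp.example.com/backups/import.rdb"
}
```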
AWS S3
Key name | Type | Description |
---|---|---|
access_key_id | string | The AWS Access Key ID with access to the bucket |
bucket_name | string | S3 bucket name |
region_name | string | Amazon S3 region name (optional) |
secret_access_key | string | The AWS Secret Access Key that matches the Access Key ID |
subdir | string | Path to the backup directory in the S3 bucket (optional) |
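An S3 import source object might look like the following sketch; the bucket name, directory, region, and credentials are placeholders:

```json
{
    "type": "s3",
    "bucket_name": "my-bucket",
    "subdir": "backups",
    "region_name": "us-east-1",
    "access_key_id": "<access-key-id>",
    "secret_access_key": "<secret-access-key>"
}
```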
Google Cloud Storage
Key name | Type | Description |
---|---|---|
bucket_name | string | Cloud Storage bucket name |
client_email | string | Email address for the Cloud Storage client ID |
client_id | string | Cloud Storage client ID with access to the Cloud Storage bucket |
private_key | string | Cloud Storage private key matching the private key ID |
private_key_id | string | Cloud Storage private key ID with access to the Cloud Storage bucket |
subdir | string | Path to the backup directory in the Cloud Storage bucket (optional) |
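A Google Cloud Storage import source object might look like the following sketch; the bucket name, directory, and service account credentials are placeholders:

```json
{
    "type": "gs",
    "bucket_name": "my-bucket",
    "subdir": "backups",
    "client_id": "<client-id>",
    "client_email": "service-account@project.iam.gserviceaccount.com",
    "private_key_id": "<private-key-id>",
    "private_key": "-----BEGIN PRIVATE KEY-----\n<private-key>\n-----END PRIVATE KEY-----\n"
}
```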
Azure Blob Storage
Key name | Type | Description |
---|---|---|
account_key | string | Access key for the storage account |
account_name | string | Storage account name with access to the container |
container | string | Blob Storage container name |
sas_token | string | Token to authenticate with shared access signature |
subdir | string | Path to the backup directory in the Blob Storage container (optional) |
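An Azure Blob Storage import source object might look like the following sketch, authenticating with an account key (a sas_token could be supplied instead); the container, account name, and key are placeholders:

```json
{
    "type": "abs",
    "container": "my-container",
    "subdir": "backups",
    "account_name": "mystorageaccount",
    "account_key": "<account-key>"
}
```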
Note: account_key and sas_token are mutually exclusive.
NAS/Local Storage
Key name | Type | Description |
---|---|---|
path | string | Path to the locally mounted filename to import. You must create the mount point on all nodes, and the redislabs:redislabs user must have read permissions on the local mount point. |
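A NAS/local storage import source object might look like the following sketch; the mount point path is a placeholder and must exist, with read permissions for the redislabs:redislabs user, on all nodes:

```json
{
    "type": "mount_point",
    "path": "/mnt/imports/import.rdb"
}
```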