S3 Service Parameters

Bacula Enterprise Only

This solution is only available for Bacula Enterprise. For subscription inquiries, please reach out to sales@baculasystems.com.

Parameters to connect to the S3 service and to control the behavior of the plugin:

endpoint
  Required: No
  Default: https://s3.amazonaws.com/
  Values: URL of an S3 endpoint
  Example: https://192.168.10.4:9000
  Description: Main URL where the S3 service is served.

access_key
  Required: Yes
  Values: Valid S3 access key to read from or write to buckets
  Example: KMN02jCv5YpmirOa
  Description: Valid access key used to read from or write to buckets.

secret_key
  Required: Yes
  Values: Valid S3 secret key associated with the provided access_key
  Example: bTq6FzPbnU9x1jqka5STRDnz3CPLouyq
  Description: Valid S3 secret key associated with the provided access_key, used to read from or write to buckets.

region
  Required: No
  Default: eu-west-1
  Values: AWS region code name: eu-west-1, us-east-1, us-east-2, eu-south-1…
  Example: us-east-2
  Description: AWS region code name where the buckets to back up exist: https://docs.aws.amazon.com/directoryservice/latest/admin-guide/regions.html

force_path_style
  Required: No
  Default: No
  Values: 0, no, No, false, FALSE, off; 1, yes, Yes, true, TRUE, on
  Example: true
  Description: Force requests to use path-style addressing (http(s)://myS3host/bucketname) instead of host-style addressing (http(s)://bucketname.myS3host). Enabling this parameter is typically needed with the most common deployments of solutions such as Ceph or MinIO.

thumbprint
  Required: No
  Values: String representing a SHA-256 SSL certificate thumbprint, either in AA:AA:AA… or in aaaaaa… format
  Example: D8:B8:9D:B1:AD:3E:37:FD:72:10:94:A5:0F:AC:AE:62:0D:BA:EA:D6:12:21:5B:7D:99:27:63:05:91:12:7E:3C or d8b89db1ad3e37fd721094a50facae620dbaead612215b7d9927630591127e3c
  Description: Specify a trusted certificate. For an HTTPS connection to the given endpoint, the plugin fetches the endpoint's certificate, computes its thumbprint and connects if it matches this value, without considering whether the certificate itself is valid.

bucket
  Required: No
  Values: Strings representing existing buckets for the given access information (endpoint, keys and region), separated by ','
  Example: mybucket1,mybucket2
  Description: Back up only the listed buckets existing in the provided endpoint (and accessible with the provided credentials). If no bucket is listed, all of them are backed up.

bucket_exclude
  Required: No
  Values: Strings representing existing buckets for the given access information (endpoint, keys and region), separated by ','
  Example: Personal
  Description: Exclude the listed buckets belonging to the configured endpoint (and accessible with the provided credentials).

bucket_regex_include
  Required: No
  Values: Valid regex
  Example: .*Company
  Description: Back up matching buckets. Provide either the list parameters (bucket + bucket_exclude) or the regex ones, but do not combine them.

objects_from_script
  Required: No
  Values: Path of an executable that the File Daemon user can run
  Example: /opt/bacula/scripts/s3-selector.sh
  Description: Run a dynamic command whose output is a list of strings representing the S3 objects of the configured bucket that need to be backed up.

bucket_regex_exclude
  Required: No
  Values: Valid regex
  Example: .*Plan
  Description: Exclude matching buckets from the selection. Provide either the list parameters (bucket + bucket_exclude) or the regex ones, but do not combine them. If this is the only selection parameter, all elements are included and this list is excluded.

folder
  Required: No
  Values: Strings representing existing folders of the applicable buckets, separated by ','
  Example: images, docs
  Description: Back up only the listed folders belonging to the selected buckets.

folder_exclude
  Required: No
  Values: Strings representing existing folders of the applicable buckets, separated by ','
  Example: personal
  Description: Exclude the listed folders belonging to the selected buckets.

folder_regex_include
  Required: No
  Values: Valid regex
  Example: .*Company
  Description: Back up matching folders. Provide either the list parameters (folder + folder_exclude) or the regex ones, but do not combine them.

folder_regex_exclude
  Required: No
  Values: Valid regex
  Example: .*Plan
  Description: Exclude matching folders from the selection. Provide either the list parameters (folder + folder_exclude) or the regex ones, but do not combine them. If this is the only selection parameter, all elements are included and this list is excluded.

version_history
  Required: No
  Default: No
  Values: 0, no, No, false, FALSE, off; 1, yes, Yes, true, TRUE, on
  Example: Yes
  Description: Include former versions of every object in the backup.

acl
  Required: No
  Default: No
  Values: 0, no, No, false, FALSE, off; 1, yes, Yes, true, TRUE, on
  Example: Yes
  Description: Back up object ACLs.

disable_hashcheck
  Required: No
  Default: No
  Values: 0, no, No, false, FALSE, off; 1, yes, Yes, true, TRUE, on
  Example: Yes
  Description: Disable the hash-check mechanism for file integrity.

glacier_mode
  Required: No
  Default: SKIP
  Values: SKIP, RETRIEVAL_CONTINUE, RETRIEVAL_WAIT_DOWNLOAD
  Example: RETRIEVAL_CONTINUE
  Description: For each object found in the Glacier tier, select the action to perform: skip the object, launch the retrieval and continue the job, or launch the retrieval and wait for it to finish so the object(s) can be backed up.

glacier_tier
  Required: No
  Default: STANDARD
  Values: STANDARD, BULK, EXPEDITED
  Example: EXPEDITED
  Description: Glacier tier to use for retrieval operations, if they need to be launched according to glacier_mode.

glacier_days
  Required: No
  Default: 10
  Values: Integer greater than 1
  Example: 30
  Description: Number of retention days for the object(s) retrieved from Glacier.

date_from
  Required: No
  Values: Date formatted as 'yyyy-MM-dd HH:mm:ss'
  Example: 2022-08-01 00:00:00
  Description: Back up objects only from this date.

date_to
  Required: No
  Values: Date formatted as 'yyyy-MM-dd HH:mm:ss'
  Example: 2022-10-15 00:00:00
  Description: Back up objects only up to this date.

storageclass
  Required: No
  Values: Strings representing AWS storage classes (STANDARD, REDUCED_REDUNDANCY, GLACIER, STANDARD_IA, ONEZONE_IA, INTELLIGENT_TIERING, DEEP_ARCHIVE, OUTPOSTS, GLACIER_IR), separated by ','
  Example: STANDARD, STANDARD_IA
  Description: Back up only objects stored in any of the listed storage classes.

storageclass_exclude
  Required: No
  Values: Strings representing AWS storage classes (STANDARD, REDUCED_REDUNDANCY, GLACIER, STANDARD_IA, ONEZONE_IA, INTELLIGENT_TIERING, DEEP_ARCHIVE, OUTPOSTS, GLACIER_IR), separated by ','
  Example: DEEP_ARCHIVE, GLACIER, ONEZONE_IA
  Description: Back up all objects, but exclude those stored in the listed storage classes.
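As a reference, the sketch below shows how several of these parameters could be combined in a Fileset, following the quoting style of the objects_from_script example further down. The Fileset name is hypothetical, and the endpoint, credentials and bucket names are the illustrative values from the table above, not working credentials:

 Fileset {
   Name = s3ServiceExample      # hypothetical name, choose your own
   Include {
     ...
     # Illustrative values only: adjust endpoint, keys, region and buckets to your setup
     Plugin = "s3: endpoint=\"https://192.168.10.4:9000\" access_key=\"KMN02jCv5YpmirOa\" secret_key=\"bTq6FzPbnU9x1jqka5STRDnz3CPLouyq\" region=\"us-east-2\" force_path_style=\"true\" bucket=\"mybucket1,mybucket2\""
   }
 }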

Note

The force_path_style option is available since BE 16.0.7.

Note

The objects_from_script option is available since BE 18.0.8.

The following example shows the kind of script and Fileset expected with the objects_from_script parameter:

Script example for objects_from_script
 $ cat /tmp/s3-selector.sh
 #!/bin/sh
 # Print one S3 object (or folder) path per line, relative to the configured bucket
 echo "FolderA/file1"
 echo "file2"
 echo "FolderB/"
 echo "FolderC/file3.abc"
 echo "FolderD/file4.def"

 # The fileset would then contain
 Fileset {
   Name = s3FromScript
   Include {
     ...
     Plugin = "s3: ... objects_from_script=\"/tmp/s3-selector.sh\""
   }
 }
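Along the same lines, a hypothetical Fileset restricting the backup by date and handling objects stored in Glacier might combine the date and Glacier parameters as follows (the Fileset name and all values are illustrative):

 Fileset {
   Name = s3GlacierExample      # hypothetical name, choose your own
   Include {
     ...
     # Retrieve Glacier objects via the EXPEDITED tier, keep restored copies 30 days,
     # and back up only objects in the given date range
     Plugin = "s3: ... glacier_mode=\"RETRIEVAL_WAIT_DOWNLOAD\" glacier_tier=\"EXPEDITED\" glacier_days=\"30\" date_from=\"2022-08-01 00:00:00\" date_to=\"2022-10-15 00:00:00\""
   }
 }

With glacier_mode=RETRIEVAL_WAIT_DOWNLOAD, the job launches the retrieval of each Glacier object and waits for it to finish, so the objects can be included in the same backup.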

Go back to: S3 Plugin: Configuration.